mirror of https://github.com/anasty17/mirror-leech-telegram-bot.git
synced 2025-01-05 10:36:52 +08:00

Add dynamic ffmpeg cmds using list of lists of cmds

- Change some sabnzbd options to use the web UI with a reverse proxy by disabling some warnings
- Remove duplicated jobs instead of pausing them
- Fix some other minor issues in nzb related to ETA
- Add a user default token option in usettings so the user can directly use his rclone.conf/token.pickle without writing mtp: or mrcc:

close #1520

Signed-off-by: anasty17 <e.anastayyar@gmail.com>

This commit is contained in:
parent e6e71eff6c
commit 82ba68a414

README.md (44 changes)
@@ -57,9 +57,20 @@ programming in Python.
  on [Sreeraj](https://github.com/SVR666) searchX-bot. (task option)
- Stop Duplicates (global and user option)
- Custom upload destination (global, user, and task option)
- Ability to choose token.pickle or service account and upload destinations from list with or without buttons (global, user and task option)
- Index link support only
  for [Bhadoo](https://gitlab.com/GoogleDriveIndex/Google-Drive-Index/-/blob/master/src/worker.js)

## Rclone

- Transfer (download/upload/clone-server-side) without or with random service accounts (global and user option)
- Ability to choose config, remote and path from list with or without buttons (global, user and task option)
- Ability to set flags for each task or globally from config (global, user and task option)
- Ability to select specific files or folders to download/copy using buttons (task option)
- Rclone.conf (global and user option)
- Rclone serve to combine remotes and use them as an index for all remotes (global option)
- Upload destination (global, user and task option)

## Status

- Download/Upload/Extract/Archive/Seed/Clone Status
@@ -122,16 +133,6 @@ programming in Python.
- Sudo settings to control users feeds
- All functions have been improved using buttons from one command.

-## Rclone
-
-- Transfer (download/upload/clone-server-side) without or with random service accounts (global and user option)
-- Ability to choose config, remote and path from list with or without buttons (global, user and task option)
-- Ability to set flags for each task or globally from config (global, user and task option)
-- Ability to select specific files or folders to download/copy using buttons (task option)
-- Rclone.conf (global and user option)
-- Rclone serve to combine remotes and use them as an index for all remotes (global option)
-- Upload destination (global, user and task option)

## Overall

- Docker image support for linux `amd64, arm64/v8, arm/v7`
@@ -156,7 +157,7 @@ programming in Python.
- Force start to upload or download or both from queue using cmds or args once you add the download (task option)
- Shell and Executor
- Add sudo users
-- Ability to save upload Paths
- Ability to save upload paths
- Name Substitution to rename the files before upload
- Supported Direct links Generators:
@@ -244,7 +245,7 @@ quotes, even if it's `Int`, `Bool` or `List`.
- `CMD_SUFFIX`: Commands index number. This number will be added at the end of all commands. `Str`|`Int`
- `AUTHORIZED_CHATS`: Fill user_id and chat_id of groups/users you want to authorize. To auth only specific topic(s) write it in this format `chat_id|thread_id` Ex:-100XXXXXXXXXXX|10 or Ex:-100XXXXXXXXXXX|10|12. Separate them by space. `Int`
- `SUDO_USERS`: Fill user_id of users whom you want to give sudo permission. Separate them by space. `Int`
-- `DEFAULT_UPLOAD`: Whether `rc` to upload to `RCLONE_PATH` or `gd` to upload to `GDRIVE_ID`. Default is `gd`. Read
- `DEFAULT_UPLOAD`: Whether `rc` to upload to `RCLONE_PATH` or `gd` to upload to `GDRIVE_ID`. Default is `rc`. Read
  More [HERE](https://github.com/anasty17/mirror-leech-telegram-bot/tree/master#upload). `Str`
- `STATUS_UPDATE_INTERVAL`: Time in seconds after which the progress/status message will be updated. Recommended `10`
  seconds at least. `Int`
@@ -265,6 +266,17 @@ quotes, even if it's `Int`, `Bool` or `List`.
- `USE_SERVICE_ACCOUNTS`: Whether to use Service Accounts or not, with google-api-python-client. For this to work
  see [Using Service Accounts](https://github.com/anasty17/mirror-leech-telegram-bot#generate-service-accounts-what-is-service-account)
  section below. Default is `False`. `Bool`
- `FFMPEG_CMDS`: List of lists of ffmpeg commands. You can set multiple ffmpeg commands to run on all files before upload. Don't write ffmpeg at the beginning; start directly with the arguments. `list`
  - Examples: [["-i", "mltb.mkv", "-c", "copy", "-c:s", "srt", "mltb.mkv"], ["-i", "mltb.video", "-c", "copy", "-c:s", "srt", "mltb"], ["-i", "mltb.m4a", "-c:a", "libmp3lame", "-q:a", "2", "mltb.mp3"], ["-i", "mltb.audio", "-c:a", "libmp3lame", "-q:a", "2", "mltb.mp3"]]
  - **Notes**:
    - Add `-del` to the list(s) from which you want the bot to delete the original files after the command completes!
    - Seed will get disabled while using this option
  - **Example**: here is how to use mltb.*, which is a reference to the files you want to work on.
    1. First cmd: the input is mltb.mkv, so this cmd will work only on mkv videos, and the output is also mltb.mkv, so all outputs are mkv.
    2. Second cmd: the input is mltb.video, so this cmd will work on all videos, and the output is only mltb, so the extension stays the same as the input file's.
    3. Third cmd: the input is mltb.m4a, so this cmd will work only on m4a audios, and the output is mltb.mp3, so the output extension is mp3.
    4. Fourth cmd: the input is mltb.audio, so this cmd will work on all audios, and the output is mltb.mp3, so the output extension is mp3.
- `NAME_SUBSTITUTE`: Add a word/letter/character/sentence/pattern to remove or to replace with other words, case-sensitive or not. **Notes**:
  1. Seed will get disabled while using this option
  2. Before any special character you must add a `\` (backslash); those are the characters: `\^$.|?*+()[]{}-`
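The four mltb.* cases above all follow a single selection rule. A minimal sketch of that rule, with illustrative helper names (this is not the bot's actual API, just a model of the behavior described above):

```python
import os

def placeholder_ext(input_token: str) -> str:
    """Map an mltb.* input placeholder to the file filter it implies."""
    if input_token.endswith(".video"):
        return "video"   # any video file
    if input_token.endswith(".audio"):
        return "audio"   # any audio file
    if "." not in input_token:
        return "all"     # any media file
    return os.path.splitext(input_token)[-1]  # e.g. ".mkv": only mkv files

def matches(path: str, ext: str, is_video: bool, is_audio: bool) -> bool:
    """Decide whether a downloaded file is processed by this cmd."""
    if not (is_video or is_audio):
        return False
    if ext == "video":
        return is_video
    if ext == "audio":
        return is_audio
    if ext == "all":
        return True
    return path.endswith(ext)
```

So `["-i", "mltb.m4a", ...]` yields the filter `".m4a"` and skips everything else, while `["-i", "mltb.audio", ...]` matches any audio file.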
@@ -338,9 +350,9 @@ quotes, even if it's `Int`, `Bool` or `List`.

**8. JDownloader**

-- `JD_EMAIL`: jdownlaoder email sign up on [JDownloader](https://my.jdownloader.org/)
-- `JD_PASS`: jdownlaoder password
-- **JDownloader Config**: You can use your config from local device to bot by *zipping* cfg folder (cfg.zip) and add it in repo folder but *before zip* you must change the downloads directory to `/root/Downloads`.
- `JD_EMAIL`: jdownloader email, sign up on [JDownloader](https://my.jdownloader.org/)
- `JD_PASS`: jdownloader password
- **JDownloader Config**: You can use your config from a local machine in the bot by *zipping* the cfg folder (cfg.zip) and adding it to the repo folder.

**9. Sabnzbd**
@@ -350,7 +362,7 @@ quotes, even if it's `Int`, `Bool` or `List`.

- [READ THIS FOR MORE INFORMATION](https://sabnzbd.org/wiki/configuration/4.2/servers)

-- Open port 8070 in your vps to access full web interface from any device. Use it like http://ip:8070/sabnzbd/.
- Open port 8070 in your vps to access the full web interface from any device. Use it like http://ip:8070/sabnzbd/. Username: mltb, password: mltbmltb

**10. RSS**
@@ -76,6 +76,7 @@ multi_tags = set()
try:
    if bool(environ.get("_____REMOVE_THIS_LINE_____")):
        log_error("The README.md file there to be read! Exiting now!")
        bot_loop.stop()
        exit(1)
except:
    pass
@@ -95,6 +96,7 @@ rss_dict = {}
BOT_TOKEN = environ.get("BOT_TOKEN", "")
if len(BOT_TOKEN) == 0:
    log_error("BOT_TOKEN variable is missing! Exiting now")
    bot_loop.stop()
    exit(1)

BOT_ID = BOT_TOKEN.split(":", 1)[0]
@@ -170,6 +172,7 @@ run(
OWNER_ID = environ.get("OWNER_ID", "")
if len(OWNER_ID) == 0:
    log_error("OWNER_ID variable is missing! Exiting now")
    bot_loop.stop()
    exit(1)
else:
    OWNER_ID = int(OWNER_ID)
@@ -177,6 +180,7 @@ else:
TELEGRAM_API = environ.get("TELEGRAM_API", "")
if len(TELEGRAM_API) == 0:
    log_error("TELEGRAM_API variable is missing! Exiting now")
    bot_loop.stop()
    exit(1)
else:
    TELEGRAM_API = int(TELEGRAM_API)
@@ -184,6 +188,7 @@ else:
TELEGRAM_HASH = environ.get("TELEGRAM_HASH", "")
if len(TELEGRAM_HASH) == 0:
    log_error("TELEGRAM_HASH variable is missing! Exiting now")
    bot_loop.stop()
    exit(1)

USER_SESSION_STRING = environ.get("USER_SESSION_STRING", "")
@@ -220,8 +225,8 @@ if len(RCLONE_FLAGS) == 0:
    RCLONE_FLAGS = ""

DEFAULT_UPLOAD = environ.get("DEFAULT_UPLOAD", "")
-if DEFAULT_UPLOAD != "rc":
-    DEFAULT_UPLOAD = "gd"
if DEFAULT_UPLOAD != "gd":
    DEFAULT_UPLOAD = "rc"

DOWNLOAD_DIR = environ.get("DOWNLOAD_DIR", "")
if len(DOWNLOAD_DIR) == 0:
@@ -420,6 +425,13 @@ MIXED_LEECH = MIXED_LEECH.lower() == "true" and IS_PREMIUM_USER
THUMBNAIL_LAYOUT = environ.get("THUMBNAIL_LAYOUT", "")
THUMBNAIL_LAYOUT = "" if len(THUMBNAIL_LAYOUT) == 0 else THUMBNAIL_LAYOUT

FFMPEG_CMDS = environ.get("FFMPEG_CMDS", "")
try:
    FFMPEG_CMDS = [] if len(FFMPEG_CMDS) == 0 else eval(FFMPEG_CMDS)
except:
    log_error(f"Wrong FFMPEG_CMDS format: {FFMPEG_CMDS}")
    FFMPEG_CMDS = []

config_dict = {
    "AS_DOCUMENT": AS_DOCUMENT,
    "AUTHORIZED_CHATS": AUTHORIZED_CHATS,
@@ -432,6 +444,7 @@ config_dict = {
    "DOWNLOAD_DIR": DOWNLOAD_DIR,
    "EQUAL_SPLITS": EQUAL_SPLITS,
    "EXTENSION_FILTER": EXTENSION_FILTER,
    "FFMPEG_CMDS": FFMPEG_CMDS,
    "FILELION_API": FILELION_API,
    "GDRIVE_ID": GDRIVE_ID,
    "INCOMPLETE_TASK_NOTIFIER": INCOMPLETE_TASK_NOTIFIER,
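The parsing above runs `eval` on an environment variable. A stricter sketch of the same parsing, assuming one wanted to accept only literal list-of-lists-of-strings input, could use `ast.literal_eval` (illustrative, not the project's code):

```python
import ast
import os

def parse_ffmpeg_cmds(raw: str) -> list:
    """Parse FFMPEG_CMDS into a list of lists of argument strings.

    ast.literal_eval only evaluates Python literals, so arbitrary code
    placed in the environment variable cannot execute (unlike eval).
    """
    if not raw:
        return []
    try:
        cmds = ast.literal_eval(raw)
    except (ValueError, SyntaxError):
        return []
    # Expect a list of lists of strings, e.g. [["-i", "mltb.mkv", ...]]
    if isinstance(cmds, list) and all(
        isinstance(c, list) and all(isinstance(a, str) for a in c) for c in cmds
    ):
        return cmds
    return []

cmds = parse_ffmpeg_cmds(os.environ.get("FFMPEG_CMDS", ""))
```

Anything that is not a plain literal (a call, an import, a number where a string is expected) is rejected and falls back to an empty list, matching the diff's error-handling behavior.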
@@ -44,6 +44,7 @@ from .ext_utils.media_utils import (
    create_thumb,
    create_sample_video,
    take_ss,
    run_ffmpeg_cmd,
)
from .ext_utils.media_utils import (
    split_file,
@@ -62,6 +63,7 @@ from .mirror_leech_utils.status_utils.media_convert_status import (
)
from .mirror_leech_utils.status_utils.split_status import SplitStatus
from .mirror_leech_utils.status_utils.zip_status import ZipStatus
from .mirror_leech_utils.status_utils.ffmpeg_status import FFmpegStatus
from .telegram_helper.bot_commands import BotCommands
from .telegram_helper.message_utils import (
    send_message,
@@ -119,6 +121,7 @@ class TaskConfig:
        self.is_torrent = False
        self.as_med = False
        self.as_doc = False
        self.ffmpeg_cmds = None
        self.chat_thread_id = None
        self.suproc = None
        self.thumb = None
@@ -126,6 +129,10 @@ class TaskConfig:
        self.is_super_chat = self.message.chat.type.name in ["SUPERGROUP", "CHANNEL"]

    def get_token_path(self, dest):
        if not dest.startswith(("mtp:", "tp:", "sa:")) and self.user_dict.get(
            "user_tokens", False
        ):
            return f"tokens/{self.user_id}.pickle"
        if dest.startswith("mtp:"):
            return f"tokens/{self.user_id}.pickle"
        elif (
@@ -138,6 +145,8 @@ class TaskConfig:
            return "token.pickle"

    def get_config_path(self, dest):
        if not dest.startswith("mrcc:") and self.user_dict.get("user_tokens", False):
            return f"rclone/{self.user_id}.conf"
        return (
            f"rclone/{self.user_id}.conf" if dest.startswith("mrcc:") else "rclone.conf"
        )
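The new user-token behavior in get_config_path can be condensed into a standalone sketch (illustrative signature, the real method reads user_dict on the task):

```python
def get_config_path(dest: str, user_id: int, user_tokens: bool) -> str:
    """Pick the rclone config for an upload destination.

    With the new user_tokens setting enabled, the user's own config is
    used even without the explicit mrcc: prefix; mrcc: still forces it.
    """
    if not dest.startswith("mrcc:") and user_tokens:
        return f"rclone/{user_id}.conf"
    return f"rclone/{user_id}.conf" if dest.startswith("mrcc:") else "rclone.conf"
```

This is exactly the usettings feature from the commit message: the user no longer has to write mtp: or mrcc: to get his own token/config picked.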
@@ -386,6 +395,18 @@ class TaskConfig:
            await create_thumb(msg) if msg.photo or msg.document else ""
        )

        self.ffmpeg_cmds = (
            self.ffmpeg_cmds
            or self.user_dict.get("ffmpeg_cmds", None)
            or (
                config_dict["FFMPEG_CMDS"]
                if "ffmpeg_cmds" not in self.user_dict
                else None
            )
        )
        if self.ffmpeg_cmds:
            self.seed = False
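The ffmpeg_cmds assignment above is a three-level override: per-task value, then the user's setting, then the global config, where the global default only applies if the user has never set the key. A standalone sketch of that resolution pattern (hypothetical helper, not the bot's code):

```python
def resolve_option(task_value, user_dict: dict, config_dict: dict, key: str):
    """Three-level option resolution, mirroring how ffmpeg_cmds is picked.

    Priority: per-task value, then the user's setting, then the global
    config. The global default is consulted only when the user has never
    touched the key, so an explicit empty user value disables it.
    """
    return (
        task_value
        or user_dict.get(key)
        or (config_dict[key.upper()] if key not in user_dict else None)
    )
```

The same shape is used elsewhere in the bot for yt_opt and the excluded-extensions settings.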
    async def get_tag(self, text: list):
        if len(text) > 1 and text[1].startswith("Tag: "):
            user_info = text[1].split("Tag: ")
@@ -828,25 +849,25 @@ class TaskConfig:
            self, dl_path, sample_duration, part_duration
        )
        if res:
-            newfolder = ospath.splitext(dl_path)[0]
            new_folder = ospath.splitext(dl_path)[0]
            name = dl_path.rsplit("/", 1)[1]
            if self.seed and not self.new_dir:
                if self.is_leech and not self.compress:
                    return self.dir
                self.new_dir = f"{self.dir}10000"
-                newfolder = newfolder.replace(self.dir, self.new_dir)
-                await makedirs(newfolder, exist_ok=True)
                new_folder = new_folder.replace(self.dir, self.new_dir)
                await makedirs(new_folder, exist_ok=True)
                await gather(
-                    copy2(dl_path, f"{newfolder}/{name}"),
-                    move(res, f"{newfolder}/SAMPLE.{name}"),
                    copy2(dl_path, f"{new_folder}/{name}"),
                    move(res, f"{new_folder}/SAMPLE.{name}"),
                )
            else:
-                await makedirs(newfolder, exist_ok=True)
                await makedirs(new_folder, exist_ok=True)
                await gather(
-                    move(dl_path, f"{newfolder}/{name}"),
-                    move(res, f"{newfolder}/SAMPLE.{name}"),
                    move(dl_path, f"{new_folder}/{name}"),
                    move(res, f"{new_folder}/SAMPLE.{name}"),
                )
-            return newfolder
            return new_folder
        else:
            for dirpath, _, files in await sync_to_async(walk, dl_path, topdown=False):
                for file_ in files:
@@ -859,7 +880,8 @@ class TaskConfig:
            await cpu_eater_lock.acquire()
            LOGGER.info(f"Creating Sample videos: {self.name}")
            if self.is_cancelled:
-                cpu_eater_lock.release()
                if checked:
                    cpu_eater_lock.release()
                return ""
            res = await create_sample_video(
                self, f_path, sample_duration, part_duration
@@ -982,7 +1004,8 @@ class TaskConfig:
            for dirpath, _, files in await sync_to_async(walk, dl_path, topdown=False):
                for file_ in files:
                    if self.is_cancelled:
-                        cpu_eater_lock.release()
                        if checked:
                            cpu_eater_lock.release()
                        return ""
                    f_path = ospath.join(dirpath, file_)
                    res = await proceed_convert(f_path)
@@ -1008,25 +1031,23 @@ class TaskConfig:
            LOGGER.info(f"Creating Screenshot for: {dl_path}")
            res = await take_ss(dl_path, ss_nb)
            if res:
-                newfolder = ospath.splitext(dl_path)[0]
                new_folder = ospath.splitext(dl_path)[0]
                name = dl_path.rsplit("/", 1)[1]
                if self.seed and not self.new_dir:
                    if self.is_leech and not self.compress:
                        return self.dir
-                    await makedirs(newfolder, exist_ok=True)
                    self.new_dir = f"{self.dir}10000"
-                    newfolder = newfolder.replace(self.dir, self.new_dir)
                    new_folder = new_folder.replace(self.dir, self.new_dir)
                    await makedirs(new_folder, exist_ok=True)
                    await gather(
-                        copy2(dl_path, f"{newfolder}/{name}"),
-                        move(res, newfolder),
                        copy2(dl_path, f"{new_folder}/{name}"),
                        move(res, new_folder),
                    )
                else:
-                    await makedirs(newfolder, exist_ok=True)
                    await makedirs(new_folder, exist_ok=True)
                    await gather(
-                        move(dl_path, f"{newfolder}/{name}"),
-                        move(res, newfolder),
                        move(dl_path, f"{new_folder}/{name}"),
                        move(res, new_folder),
                    )
-                return newfolder
                return new_folder
            else:
                LOGGER.info(f"Creating Screenshot for: {dl_path}")
                for dirpath, _, files in await sync_to_async(walk, dl_path, topdown=False):
@@ -1055,7 +1076,9 @@ class TaskConfig:
            try:
                name = sub(rf"{pattern}", res, name, flags=I if sen else 0)
            except Exception as e:
-                LOGGER.error(f"Substitute Error: pattern: {pattern} res: {res}. Errro: {e}")
                LOGGER.error(
                    f"Substitute Error: pattern: {pattern} res: {res}. Errro: {e}"
                )
                return dl_path
            if len(name.encode()) > 255:
                LOGGER.error(f"Substitute: {name} is too long")
@@ -1081,12 +1104,93 @@ class TaskConfig:
                    else:
                        res = ""
                    try:
-                        file_ = sub(rf"{pattern}", res, file_, flags=I if sen else 0)
                        file_ = sub(
                            rf"{pattern}", res, file_, flags=I if sen else 0
                        )
                    except Exception as e:
-                        LOGGER.error(f"Substitute Error: pattern: {pattern} res: {res}. Errro: {e}")
                        LOGGER.error(
                            f"Substitute Error: pattern: {pattern} res: {res}. Errro: {e}"
                        )
                        continue
                    if len(file_.encode()) > 255:
                        LOGGER.error(f"Substitute: {file_} is too long")
                        continue
                    await move(f_path, ospath.join(dirpath, file_))
        return dl_path

    async def proceed_ffmpeg(self, dl_path, gid):
        checked = False
        for ffmpeg_cmd in self.ffmpeg_cmds:
            if "-del" in ffmpeg_cmd:
                ffmpeg_cmd.remove("-del")
                delete_files = True
            else:
                delete_files = False
            ffmpeg_cmd.insert(0, "ffmpeg")
            index = ffmpeg_cmd.index("-i")
            input_file = ffmpeg_cmd[index + 1]
            if input_file.endswith(".video"):
                ext = "video"
            elif input_file.endswith(".audio"):
                ext = "audio"
            elif "." not in input_file:
                ext = "all"
            else:
                ext = ospath.splitext(input_file)[-1]
            if await aiopath.isfile(dl_path):
                is_video, is_audio, _ = await get_document_type(dl_path)
                if not is_video and not is_audio:
                    break
                elif is_video and ext == "audio":
                    break
                elif is_audio and ext == "video":
                    break
                elif ext != "all" and not dl_path.endswith(ext):
                    break
                new_folder = ospath.splitext(dl_path)[0]
                name = dl_path.rsplit("/", 1)[1]
                await makedirs(new_folder, exist_ok=True)
                file_path = f"{new_folder}/{name}"
                await move(dl_path, file_path)
                dl_path = new_folder
                if not checked:
                    checked = True
                    async with task_dict_lock:
                        task_dict[self.mid] = FFmpegStatus(self, gid)
                    await cpu_eater_lock.acquire()
                LOGGER.info(f"Running ffmpeg cmd for: {file_path}")
                ffmpeg_cmd[index + 1] = file_path
                res = await run_ffmpeg_cmd(self, ffmpeg_cmd, file_path)
                if res and delete_files:
                    await remove(file_path)
            else:
                for dirpath, _, files in await sync_to_async(
                    walk, dl_path, topdown=False
                ):
                    for file_ in files:
                        if self.is_cancelled:
                            cpu_eater_lock.release()
                            return ""
                        f_path = ospath.join(dirpath, file_)
                        is_video, is_audio, _ = await get_document_type(f_path)
                        if not is_video and not is_audio:
                            continue
                        elif is_video and ext == "audio":
                            continue
                        elif is_audio and ext == "video":
                            continue
                        elif ext != "all" and not f_path.endswith(ext):
                            continue
                        ffmpeg_cmd[index + 1] = f_path
                        if not checked:
                            checked = True
                            async with task_dict_lock:
                                task_dict[self.mid] = FFmpegStatus(self, gid)
                            await cpu_eater_lock.acquire()
                        LOGGER.info(f"Running ffmpeg cmd for: {f_path}")
                        res = await run_ffmpeg_cmd(self, ffmpeg_cmd, f_path)
                        if res and delete_files:
                            await remove(f_path)
        if checked:
            cpu_eater_lock.release()
        return dl_path
@@ -104,7 +104,7 @@ def arg_parser(items, arg_base):
        "-sync",
        "-ml",
        "-doc",
-        "-med"
        "-med",
    }
    t = len(items)
    i = 0
@@ -118,7 +118,8 @@ def arg_parser(items, arg_base):
        if (
            i + 1 == t
            and part in bool_arg_set
-            or part in ["-s", "-j", "-f", "-fd", "-fu", "-sync", "-ml", "-doc", "-med"]
            or part
            in ["-s", "-j", "-f", "-fd", "-fu", "-sync", "-ml", "-doc", "-med"]
        ):
            arg_base[part] = True
        else:
@@ -69,7 +69,7 @@ If DEFAULT_UPLOAD is `rc` then you can pass up: `gd` to upload using gdrive tool
If DEFAULT_UPLOAD is `gd` then you can pass up: `rc` to upload to RCLONE_PATH.

If you want to add a path or gdrive manually from your config/token (UPLOADED FROM USETTING) add mrcc: for rclone and mtp: before the path/gdrive_id without space.
-/cmd link -up mrcc:main:dump or -up mtp:gdrive_id
/cmd link -up mrcc:main:dump or -up mtp:gdrive_id <strong>or you can simply edit upload using owner/user token/config from usetting without adding mtp: or mrcc: before the upload path/id</strong>

To add leech destination:
-up id
@@ -709,3 +709,38 @@ async def create_sample_video(listener, video_file, sample_duration, part_duration
        return False
    await gather(remove(segments_file), rmtree(f"{dir}/mltb_segments"))
    return output_file"""


async def run_ffmpeg_cmd(listener, ffmpeg, path):
    base_name, ext = ospath.splitext(path)
    output_file = ffmpeg[-1]
    if output_file != "mltb" and output_file.startswith("mltb"):
        ext = ospath.splitext(output_file)[-1]
    else:
        base_name = f"ffmpeg - {base_name}"
    output = f"{base_name}{ext}"
    ffmpeg[-1] = output
    if listener.is_cancelled:
        return False
    async with subprocess_lock:
        listener.suproc = await create_subprocess_exec(*ffmpeg, stderr=PIPE)
    _, stderr = await listener.suproc.communicate()
    if listener.is_cancelled:
        return False
    code = listener.suproc.returncode
    if code == 0:
        return output
    elif code == -9:
        listener.is_cancelled = True
        return False
    else:
        try:
            stderr = stderr.decode().strip()
        except:
            stderr = "Unable to decode the error!"
        LOGGER.error(
            f"{stderr}. Something went wrong while running ffmpeg cmd, mostly file requires different/specific arguments. Path: {path}"
        )
        if await aiopath.exists(output):
            await remove(output)
        return False
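The output-naming rule at the top of run_ffmpeg_cmd can be isolated as a pure function (a sketch of the behavior in the diff, not the project's API):

```python
import os

def ffmpeg_output_name(path: str, output_token: str) -> str:
    """Derive the real output path from an mltb output placeholder.

    "mltb.mp3" keeps the input's base name but swaps the extension; a bare
    "mltb" keeps the input extension and prefixes the base name with
    "ffmpeg - " so the output never collides with the input file.
    """
    base_name, ext = os.path.splitext(path)
    if output_token != "mltb" and output_token.startswith("mltb"):
        ext = os.path.splitext(output_token)[-1]
    else:
        base_name = f"ffmpeg - {base_name}"
    return f"{base_name}{ext}"
```

Note that, as in the diffed code, the "ffmpeg - " prefix is applied to the whole remaining path string, not just the final filename component.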
@@ -31,6 +31,7 @@ class MirrorStatus:
    STATUS_SEEDING = "Seed"
    STATUS_SAMVID = "SamVid"
    STATUS_CONVERTING = "Convert"
    STATUS_FFMPEG = "FFmpeg"


STATUSES = {
@@ -43,10 +44,11 @@ STATUSES = {
    "EX": MirrorStatus.STATUS_EXTRACTING,
    "SD": MirrorStatus.STATUS_SEEDING,
    "CM": MirrorStatus.STATUS_CONVERTING,
-    "CL": MirrorStatus.STATUS_CLONING,
    "SP": MirrorStatus.STATUS_SPLITTING,
    "CK": MirrorStatus.STATUS_CHECKING,
    "SV": MirrorStatus.STATUS_SAMVID,
    "FF": MirrorStatus.STATUS_FFMPEG,
    "CL": MirrorStatus.STATUS_CLONING,
    "PA": MirrorStatus.STATUS_PAUSED,
}
@@ -118,8 +120,22 @@ def get_readable_time(seconds: int):


def time_to_seconds(time_duration):
-    hours, minutes, seconds = map(int, time_duration.split(":"))
-    return hours * 3600 + minutes * 60 + seconds
    try:
        parts = time_duration.split(":")
        if len(parts) == 3:
            hours, minutes, seconds = map(int, parts)
        elif len(parts) == 2:
            hours = 0
            minutes, seconds = map(int, parts)
        elif len(parts) == 1:
            hours = 0
            minutes = 0
            seconds = int(parts[0])
        else:
            return 0
        return hours * 3600 + minutes * 60 + seconds
    except ValueError as e:
        return 0


def speed_string_to_bytes(size_text: str):
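This is the nzb ETA fix from the commit message: sabnzbd can report ETAs as HH:MM:SS, MM:SS, or bare SS, and the old one-shot unpacking raised on anything but three fields. A compact equivalent sketch of the new behavior (reformulated, not the project's exact code):

```python
def time_to_seconds(time_duration: str) -> int:
    """Parse "HH:MM:SS", "MM:SS", or plain "SS" into seconds; 0 on bad input."""
    try:
        parts = [int(p) for p in time_duration.split(":")]
    except ValueError:
        return 0
    if not 1 <= len(parts) <= 3:
        return 0
    seconds = 0
    for part in parts:
        seconds = seconds * 60 + part  # shift previous units up one place
    return seconds
```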
@@ -178,6 +194,7 @@ async def get_readable_message(sid, is_user, page_no=1, status="All", page_step=
        MirrorStatus.STATUS_SEEDING,
        MirrorStatus.STATUS_SAMVID,
        MirrorStatus.STATUS_CONVERTING,
        MirrorStatus.STATUS_FFMPEG,
        MirrorStatus.STATUS_QUEUEUP,
    ]:
        progress = (
@@ -27,38 +27,38 @@ async def _remove_job(nzo_id, mid):

@new_task
async def _on_download_error(err, nzo_id, button=None):
-    task = await get_task_by_gid(nzo_id)
-    LOGGER.info(f"Cancelling Download: {task.name()}")
-    await gather(
-        task.listener.on_download_error(err, button),
-        _remove_job(nzo_id, task.listener.mid),
-    )
    if task := await get_task_by_gid(nzo_id):
        LOGGER.info(f"Cancelling Download: {task.name()}")
        await gather(
            task.listener.on_download_error(err, button),
            _remove_job(nzo_id, task.listener.mid),
        )


@new_task
async def _change_status(nzo_id, status):
-    task = await get_task_by_gid(nzo_id)
-    async with task_dict_lock:
-        task.cstatus = status
    if task := await get_task_by_gid(nzo_id):
        async with task_dict_lock:
            task.cstatus = status


@new_task
async def _stop_duplicate(nzo_id):
-    task = await get_task_by_gid(nzo_id)
-    await task.update()
-    task.listener.name = task.name()
-    msg, button = await stop_duplicate_check(task.listener)
-    if msg:
-        _on_download_error(msg, nzo_id, button)
    if task := await get_task_by_gid(nzo_id):
        await task.update()
        task.listener.name = task.name()
        msg, button = await stop_duplicate_check(task.listener)
        if msg:
            _on_download_error(msg, nzo_id, button)


@new_task
async def _on_download_complete(nzo_id):
-    task = await get_task_by_gid(nzo_id)
-    await task.listener.on_download_complete()
-    if intervals["stopAll"]:
-        return
-    await _remove_job(nzo_id, task.listener.mid)
    if task := await get_task_by_gid(nzo_id):
        await task.listener.on_download_complete()
        if intervals["stopAll"]:
            return
        await _remove_job(nzo_id, task.listener.mid)


@new_task
@@ -96,6 +96,9 @@ async def _nzb_listener():
            nzo_id = dl["nzo_id"]
            if nzo_id not in nzb_jobs:
                continue
            if dl["labels"] and dl["labels"][0] == "ALTERNATIVE":
                await _on_download_error("Duplicated Job!", nzo_id)
                continue
            if (
                dl["status"] == "Downloading"
                and not nzb_jobs[nzo_id]["stop_dup_check"]
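The refactor above wraps each handler body in a walrus-operator guard, because get_task_by_gid can return None once a job has already been removed. A minimal standalone illustration of the pattern (toy names, not the bot's async registry):

```python
def get_task_by_gid(tasks: dict, gid: str):
    """Toy lookup standing in for the bot's task registry."""
    return tasks.get(gid)

def cancel(tasks: dict, gid: str) -> str:
    # Assignment and None-check in one expression: the body only runs
    # when the task still exists, avoiding an AttributeError on None.
    if task := get_task_by_gid(tasks, gid):
        return f"Cancelling Download: {task['name']}"
    return "already removed"
```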
@@ -178,6 +178,15 @@ class TaskListener(TaskConfig):
            if self.is_cancelled:
                return
            self.name = up_path.rsplit("/", 1)[1]

        if self.ffmpeg_cmds:
            up_path = await self.proceed_ffmpeg(
                up_path,
                gid,
            )
            if self.is_cancelled:
                return
            up_dir, self.name = up_path.rsplit("/", 1)
            self.size = await get_path_size(up_dir)

        if self.screen_shots:
            up_path = await self.generate_screenshots(up_path)
@@ -188,7 +197,11 @@ class TaskListener(TaskConfig):

        if self.convert_audio or self.convert_video:
            up_path = await self.convert_media(
-                up_path, gid, unwanted_files, unwanted_files_size, files_to_delete
                up_path,
                gid,
                unwanted_files,
                unwanted_files_size,
                files_to_delete,
            )
            if self.is_cancelled:
                return
bot/helper/mirror_leech_utils/status_utils/ffmpeg_status.py (new file, 35 lines)
@@ -0,0 +1,35 @@
from bot import LOGGER, subprocess_lock
from ...ext_utils.status_utils import get_readable_file_size, MirrorStatus


class FFmpegStatus:
    def __init__(self, listener, gid):
        self.listener = listener
        self._gid = gid
        self._size = self.listener.size

    def gid(self):
        return self._gid

    def name(self):
        return self.listener.name

    def size(self):
        return get_readable_file_size(self._size)

    def status(self):
        return MirrorStatus.STATUS_FFMPEG

    def task(self):
        return self

    async def cancel_task(self):
        LOGGER.info(f"Cancelling ffmpeg_cmd: {self.listener.name}")
        self.listener.is_cancelled = True
        async with subprocess_lock:
            if (
                self.listener.suproc is not None
                and self.listener.suproc.returncode is None
            ):
                self.listener.suproc.kill()
        await self.listener.on_upload_error("ffmpeg cmd stopped by user!")
@@ -933,8 +933,8 @@ async def load_config():
        RCLONE_PATH = ""

    DEFAULT_UPLOAD = environ.get("DEFAULT_UPLOAD", "")
-    if DEFAULT_UPLOAD != "rc":
-        DEFAULT_UPLOAD = "gd"
    if DEFAULT_UPLOAD != "gd":
        DEFAULT_UPLOAD = "rc"

    RCLONE_FLAGS = environ.get("RCLONE_FLAGS", "")
    if len(RCLONE_FLAGS) == 0:
@@ -1157,6 +1157,13 @@ async def load_config():
    THUMBNAIL_LAYOUT = environ.get("THUMBNAIL_LAYOUT", "")
    THUMBNAIL_LAYOUT = "" if len(THUMBNAIL_LAYOUT) == 0 else THUMBNAIL_LAYOUT

    FFMPEG_CMDS = environ.get("FFMPEG_CMDS", "")
    try:
        FFMPEG_CMDS = [] if len(FFMPEG_CMDS) == 0 else eval(FFMPEG_CMDS)
    except:
        LOGGER.error(f"Wrong FFMPEG_CMDS format: {FFMPEG_CMDS}")
        FFMPEG_CMDS = []

    await (await create_subprocess_exec("pkill", "-9", "-f", "gunicorn")).wait()
    BASE_URL = environ.get("BASE_URL", "").rstrip("/")
    if len(BASE_URL) == 0:
@@ -1208,6 +1215,7 @@ async def load_config():
        "DOWNLOAD_DIR": DOWNLOAD_DIR,
        "EQUAL_SPLITS": EQUAL_SPLITS,
        "EXTENSION_FILTER": EXTENSION_FILTER,
        "FFMPEG_CMDS": FFMPEG_CMDS,
        "FILELION_API": FILELION_API,
        "GDRIVE_ID": GDRIVE_ID,
        "INCOMPLETE_TASK_NOTIFIER": INCOMPLETE_TASK_NOTIFIER,
@@ -117,6 +117,7 @@ def create_cancel_buttons(is_sudo, user_id=""):
    buttons.data_button(
        "ConvertMedia", f"canall ms {MirrorStatus.STATUS_CONVERTING} {user_id}"
    )
    buttons.data_button("FFmpeg", f"canall ms {MirrorStatus.STATUS_FFMPEG} {user_id}")
    buttons.data_button("Paused", f"canall ms {MirrorStatus.STATUS_PAUSED} {user_id}")
    buttons.data_button("All", f"canall ms All {user_id}")
    if is_sudo:
@@ -109,6 +109,7 @@ class Mirror(TaskListener):
            "-cv": "",
            "-ns": "",
            "-tl": "",
            "-ff": "None",
        }

        arg_parser(input_list[1:], args)
@@ -154,6 +155,12 @@ class Mirror(TaskListener):
        except:
            self.multi = 0

        try:
            self.ffmpeg_cmds = eval(args["-ff"])
        except Exception as e:
            self.ffmpeg_cmds = None
            LOGGER.error(e)

        if not isinstance(self.seed, bool):
            dargs = self.seed.split(":")
            ratio = dargs[0] or None
@@ -99,6 +99,7 @@ async def status_pages(_, query):
        "Pause": 0,
        "SamVid": 0,
        "ConvertMedia": 0,
        "FFmpeg": 0,
    }
    dl_speed = 0
    up_speed = 0
@@ -135,6 +136,8 @@ async def status_pages(_, query):
                tasks["SamVid"] += 1
            case MirrorStatus.STATUS_CONVERTING:
                tasks["ConvertMedia"] += 1
            case MirrorStatus.STATUS_FFMPEG:
                tasks["FFMPEG"] += 1
            case _:
                tasks["Download"] += 1
                dl_speed += speed_string_to_bytes(download.speed())
@@ -142,7 +145,7 @@ async def status_pages(_, query):
    msg = f"""<b>DL:</b> {tasks['Download']} | <b>UP:</b> {tasks['Upload']} | <b>SD:</b> {tasks['Seed']} | <b>AR:</b> {tasks['Archive']}
<b>EX:</b> {tasks['Extract']} | <b>SP:</b> {tasks['Split']} | <b>QD:</b> {tasks['QueueDl']} | <b>QU:</b> {tasks['QueueUp']}
<b>CL:</b> {tasks['Clone']} | <b>CK:</b> {tasks['CheckUp']} | <b>PA:</b> {tasks['Pause']} | <b>SV:</b> {tasks['SamVid']}
-<b>CM:</b> {tasks['ConvertMedia']}
<b>CM:</b> {tasks['ConvertMedia']} <b>FF:</b> {tasks['FFmpeg']}

<b>ODLS:</b> {get_readable_file_size(dl_speed)}/s
<b>OULS:</b> {get_readable_file_size(up_speed)}/s
@@ -156,8 +156,15 @@ async def get_user_settings(from_user):
        user_dict.get("default_upload", "") or config_dict["DEFAULT_UPLOAD"]
    )
    du = "Gdrive API" if default_upload == "gd" else "Rclone"
    dub = "Gdrive API" if default_upload != "gd" else "Rclone"
    buttons.data_button(f"Upload using {dub}", f"userset {user_id} {default_upload}")
    dur = "Gdrive API" if default_upload != "gd" else "Rclone"
    buttons.data_button(f"Upload using {dur}", f"userset {user_id} {default_upload}")

    user_tokens = user_dict.get("user_tokens", False)
    tr = "MY" if user_tokens else "OWNER"
    trr = "OWNER" if user_tokens else "MY"
    buttons.data_button(
        f"Use {trr} token/config", f"userset {user_id} user_tokens {user_tokens}"
    )

    buttons.data_button("Excluded Extensions", f"userset {user_id} ex_ex")
    if user_dict.get("excluded_extensions", False):
@@ -173,11 +180,19 @@ async def get_user_settings(from_user):
    buttons.data_button("YT-DLP Options", f"userset {user_id} yto")
    if user_dict.get("yt_opt", False):
        ytopt = user_dict["yt_opt"]
    elif "yt_opt" not in user_dict and (YTO := config_dict["YT_DLP_OPTIONS"]):
        ytopt = YTO
    elif "yt_opt" not in user_dict and config_dict["YT_DLP_OPTIONS"]:
        ytopt = config_dict["YT_DLP_OPTIONS"]
    else:
        ytopt = "None"

    buttons.data_button("FFmpeg Cmds", f"userset {user_id} ffc")
    if user_dict.get("ffmpeg_cmds", False):
        ffc = user_dict["ffmpeg_cmds"]
    elif "ffmpeg_cmds" not in user_dict and config_dict["FFMPEG_CMDS"]:
        ffc = config_dict["FFMPEG_CMDS"]
    else:
        ffc = "None"

    if user_dict:
        buttons.data_button("Reset All", f"userset {user_id} reset")

@@ -201,10 +216,12 @@ Upload Paths is <b>{upload_paths}</b>
Gdrive ID is <code>{gdrive_id}</code>
Index Link is <code>{index}</code>
Stop Duplicate is <b>{sd_msg}</b>
Default Upload is <b>{du}</b>
Default Package is <b>{du}</b>
Upload using <b>{tr}</b> token/config
Name substitution is <b>{ns_msg}</b>
Excluded Extensions is <code>{ex_ex}</code>
YT-DLP Options is <b><code>{escape(ytopt)}</code></b>"""
YT-DLP Options is <code>{escape(ytopt)}</code>
FFMPEG Commands is <code>{ffc}</code>"""

    return text, buttons.build_menu(1)

@@ -309,6 +326,18 @@ async def set_option(_, message, pre_event, option):
            name, path = data
            user_dict["upload_paths"][name] = path
        value = user_dict["upload_paths"]
    elif option == "ffmpeg_cmds":
        if value.startswith("[") and value.endswith("]"):
            try:
                value = eval(value)
            except Exception as e:
                await send_message(message, str(e))
                await update_user_settings(pre_event)
                return
        else:
            await send_message(message, "It must be a list of lists!")
            await update_user_settings(pre_event)
            return
    update_user_ldata(user_id, option, value)
    await delete_message(message)
    await update_user_settings(pre_event)
@@ -396,6 +425,7 @@ async def edit_user_settings(client, query):
        "excluded_extensions",
        "name_sub",
        "thumb_layout",
        "ffmpeg_cmds",
    ]:
        await query.answer()
        update_user_ldata(user_id, data[2], "")
@@ -622,6 +652,28 @@ Check all yt-dlp api options from this <a href='https://github.com/yt-dlp/yt-dlp
        await edit_message(message, rmsg, buttons.build_menu(1))
        pfunc = partial(set_option, pre_event=query, option="yt_opt")
        await event_handler(client, query, pfunc)
    elif data[2] == "ffc":
        await query.answer()
        buttons = ButtonMaker()
        if user_dict.get("ffmpeg_cmds", False) or config_dict["FFMPEG_CMDS"]:
            buttons.data_button(
                "Remove FFmpeg Cmds", f"userset {user_id} ffmpeg_cmds", "header"
            )
        buttons.data_button("Back", f"userset {user_id} back")
        buttons.data_button("Close", f"userset {user_id} close")
        rmsg = """List of lists of FFmpeg commands. You can set multiple FFmpeg commands to run on all files before upload. Don't write ffmpeg at the beginning; start directly with the arguments.
Notes:
1. Add <code>-del</code> to any list whose original files you want the bot to delete after the command completes!
2. Seed will get disabled while using this option.
Examples: [["-i", "mltb.mkv", "-c", "copy", "-c:s", "srt", "mltb.mkv"], ["-i", "mltb.video", "-c", "copy", "-c:s", "srt", "mltb"], ["-i", "mltb.m4a", "-c:a", "libmp3lame", "-q:a", "2", "mltb.mp3"], ["-i", "mltb.audio", "-c:a", "libmp3lame", "-q:a", "2", "mltb.mp3"]]
Here is how to use mltb.*, which is a reference to the files you want to work on:
1. First cmd: the input is mltb.mkv, so this cmd will work only on mkv videos, and the output is also mltb.mkv, so all outputs are mkv.
2. Second cmd: the input is mltb.video, so this cmd will work on all videos, and the output is just mltb, so the extension stays the same as the input file's.
3. Third cmd: the input is mltb.m4a, so this cmd will work only on m4a audios, and the output is mltb.mp3, so the output extension is mp3.
4. Fourth cmd: the input is mltb.audio, so this cmd will work on all audios, and the output is mltb.mp3, so the output extension is mp3."""
        await edit_message(message, rmsg, buttons.build_menu(1))
        pfunc = partial(set_option, pre_event=query, option="ffmpeg_cmds")
        await event_handler(client, query, pfunc)
    elif data[2] == "lss":
        await query.answer()
        buttons = ButtonMaker()
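The `mltb.*` placeholder convention described in the help text above can be sketched in code. This is an illustration only: `build_argv` and `applies` are hypothetical helpers, the extension groups and output-naming scheme are assumptions, and handling of an output path colliding with the input path is omitted.

```python
import os

# Assumed groupings for the mltb.video / mltb.audio wildcard tokens.
VIDEO_EXTS = {".mkv", ".mp4", ".webm", ".avi"}
AUDIO_EXTS = {".m4a", ".mp3", ".opus", ".flac"}

def applies(input_token, ext):
    """Does this command's -i placeholder match a file with extension ext?"""
    if input_token == "mltb.video":
        return ext in VIDEO_EXTS
    if input_token == "mltb.audio":
        return ext in AUDIO_EXTS
    return input_token == "mltb" + ext  # exact form, e.g. mltb.mkv

def build_argv(cmd, file_path):
    """Substitute mltb placeholders into one command from the list of lists.
    Returns a full ffmpeg argv, or None if the command skips this file."""
    base, ext = os.path.splitext(file_path)
    i = cmd.index("-i")
    if not applies(cmd[i + 1], ext):
        return None
    argv = ["ffmpeg"]
    for pos, arg in enumerate(cmd):
        if pos == i + 1:
            argv.append(file_path)  # real input path replaces the placeholder
        elif arg.startswith("mltb"):
            # bare "mltb" keeps the input extension; "mltb.mp3" forces .mp3
            argv.append(base + (arg[4:] or ext))
        else:
            argv.append(arg)
    return argv
```

For example, the third command from the help text applied to `/dl/song.m4a` yields an argv ending in `/dl/song.mp3`, while a `mltb.mkv` command returns None for an mp4 input.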
@@ -808,6 +860,13 @@ Example: script/code/s | mirror/leech | tea/ /s | clone | cpu/ | \[mltb\]/mltb |
        await update_user_settings(query)
        if config_dict["DATABASE_URL"]:
            await database.update_user_data(user_id)
    elif data[2] == "user_tokens":
        await query.answer()
        tr = data[3].lower() == "false"
        update_user_ldata(user_id, "user_tokens", tr)
        await update_user_settings(query)
        if config_dict["DATABASE_URL"]:
            await database.update_user_data(user_id)
    elif data[2] == "upload_paths":
        await query.answer()
        buttons = ButtonMaker()

@@ -311,6 +311,7 @@ class YtDlp(TaskListener):
            "-cv": "",
            "-ns": "",
            "-tl": "",
            "-ff": "None",
        }

        arg_parser(input_list[1:], args)
@@ -320,6 +321,11 @@ class YtDlp(TaskListener):
        except:
            self.multi = 0

        try:
            self.ffmpeg_cmds = eval(args["-ff"])
        except:
            self.ffmpeg_cmds = None

        self.select = args["-s"]
        self.name = args["-n"]
        self.up_dest = args["-up"]

@@ -22,6 +22,7 @@ INCOMPLETE_TASK_NOTIFIER = "False"
YT_DLP_OPTIONS = ""
USE_SERVICE_ACCOUNTS = "False"
NAME_SUBSTITUTE = ""
FFMPEG_CMDS = ""
# GDrive Tools
GDRIVE_ID = ""
IS_TEAM_DRIVE = "False"
@@ -38,7 +39,7 @@ RCLONE_SERVE_PASS = ""
JD_EMAIL = ""
JD_PASS = ""
# Sabnzbd
USENET_SERVERS = "[{'name': 'main', 'host': '', 'port': 5126, 'timeout': 60, 'username': '', 'password': '', 'connections': 8, 'ssl': 1, 'ssl_verify': 2, 'ssl_ciphers': '', 'enable': 1, 'required': 0, 'optional': 0, 'retention': 0, 'send_group': 0, 'priority': 0}]"
USENET_SERVERS = "[{'name': 'main', 'host': '', 'port': 563, 'timeout': 60, 'username': '', 'password': '', 'connections': 8, 'ssl': 1, 'ssl_verify': 2, 'ssl_ciphers': '', 'enable': 1, 'required': 0, 'optional': 0, 'retention': 0, 'send_group': 0, 'priority': 0}]"
# Update
UPSTREAM_REPO = ""
UPSTREAM_BRANCH = ""

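Since `USENET_SERVERS` is stored as a string of Python literals, it can be turned into a list of server dicts with `ast.literal_eval`; this is a sketch of the assumed parsing, not necessarily the bot's own code. Port 563, used in the new line above in place of 5126, is the standard port for NNTP over TLS.

```python
from ast import literal_eval

# Trimmed example of the config value above.
USENET_SERVERS = "[{'name': 'main', 'host': '', 'port': 563, 'ssl': 1, 'connections': 8}]"

# String -> list of server dicts, without executing arbitrary code.
servers = literal_eval(USENET_SERVERS)
```

Each dict can then be passed on to SABnzbd's server configuration as-is.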
@@ -1,7 +1,7 @@
__version__ = 19
__encoding__ = utf-8
[misc]
helpful_warnings = 1
helpful_warnings = 0
queue_complete = ""
queue_complete_pers = 0
bandwidth_perc = 100
@@ -16,12 +16,12 @@ sorters_converted = 1
check_new_rel = 1
auto_browser = 1
language = en
enable_https_verification = 1
enable_https_verification = 0
host = ::
port = 8070
https_port = ""
username = ""
password = ""
username = "mltb"
password = "mltbmltb"
bandwidth_max = ""
cache_limit = ""
web_dir = Glitter
@@ -35,9 +35,9 @@ api_key = mltb
nzb_key = ""
socks5_proxy_url = ""
permissions = ""
download_dir = /usr/src/app/downloads/incomplete
download_dir =
download_free = ""
complete_dir = /usr/src/app/downloads/complete
complete_dir =
complete_free = ""
fulldisk_autoresume = 0
script_dir = ""
@@ -116,7 +116,7 @@ enable_filejoin = 1
enable_tsjoin = 1
overwrite_files = 0
ignore_unrar_dates = 0
backup_for_duplicates = 1
backup_for_duplicates = 0
empty_postproc = 0
wait_for_dfolder = 0
rss_filenames = 0
@@ -128,7 +128,7 @@ tray_icon = 1
allow_incomplete_nzb = 0
enable_broadcast = 1
ipv6_hosting = 0
api_warnings = 1
api_warnings = 0
no_penalties = 0
x_frame_options = 1
allow_old_ssl_tls = 0

@@ -1,8 +1,5 @@
class SubFunctions:

    def __init__(self):
        pass

    async def check_login(self):
        res = await self.get_config("servers")
        if res["config"]:

@@ -3,9 +3,6 @@ from sabnzbdapi.bound_methods import SubFunctions


class JobFunctions(SubFunctions):

    def __init__(self):
        pass

    async def add_uri(
        self,
        url: str = "",