mirror of https://github.com/anasty17/mirror-leech-telegram-bot.git
synced 2025-01-09 04:47:34 +08:00

Ability to zip/unzip multi links in the same directory. Mostly helpful for unzipping tg file parts
- Revert media group only for parts.

Signed-off-by: anasty17 <e.anastayyar@gmail.com>

This commit is contained in:
parent e3e1965388
commit 2360113d5a
@@ -25,7 +25,7 @@ In each single file there is a major change from base code, it's almost totaly d
 - 4GB file upload with premium account
 - Upload all files to specific superGroup/channel.
 - Leech Split size and equal split size settings for each user
-- Ability to leech files in media group. Setting for each user
+- Ability to leech splitted file parts in media group. Setting for each user

 ### Google

 - Stop duplicates for all tasks except yt-dlp tasks
 - Download from Google Drive
@@ -73,13 +73,14 @@ In each single file there is a major change from base code, it's almost totaly d
 - Docker image support for linux `amd64, arm64/v8, arm/v7, s390x`
 - Edit variables and overwrite the private files while bot running
 - Update bot at startup and with restart command using `UPSTREAM_REPO`
-- Improve Telegraph. Based on [Sreeraj](https://github.com/SVR666) loaderX-bot.
+- Improve Telegraph. Based on [Sreeraj](https://github.com/SVR666) loaderX-bot
 - Mirror/Leech/Watch/Clone/Count/Del by reply
 - Mirror/Leech/Clone multi links/files with one command
 - Custom name for all links except torrents. For files you should add extension except yt-dlp links
 - Extensions Filter for the files to be uploaded/cloned
 - View Link button. Extra button to open index link in broswer instead of direct download for file
 - Queueing System
+- Ability to zip/unzip multi links in same directory. Mostly helpful in unziping tg file parts
 - Almost all repository functions have been improved and many other details can't mention all of them
 - Many bugs have been fixed
@@ -182,7 +183,7 @@ Fill up rest of the fields. Meaning of each field is discussed below. **NOTE**:
 - `LEECH_SPLIT_SIZE`: Size of split in bytes. Default is `2GB`. Default is `4GB` if your account is premium. `Int`
 - `AS_DOCUMENT`: Default type of Telegram file upload. Default is `False` mean as media. `Bool`
 - `EQUAL_SPLITS`: Split files larger than **LEECH_SPLIT_SIZE** into equal parts size (Not working with zip cmd). Default is `False`. `Bool`
-- `MEDIA_GROUP`: View Uploaded files in media group. Default is `False`. `Bool`.**NOTE**: Some files will end without any reply, it's hard to manage. Maybe in future i will fix it.
+- `MEDIA_GROUP`: View Uploaded splitted file parts in media group. Default is `False`. `Bool`.
 - `LEECH_FILENAME_PREFIX`: Add custom word to leeched file name. `Str`
 - `DUMP_CHAT`: Chat ID. Upload files to specific chat. `str`. **NOTE**: Only available for superGroup/channel. Add `-100` before channel/superGroup id. In short don't add bot id or your id!
 - `USER_SESSION_STRING`: To download/upload from your telegram account. If you own premium account. To generate session string use this command `python3 generate_string_session.py` after mounting repo folder for sure. `Str`. **NOTE**: You can't use bot with private message. Use it with superGroup.
@@ -411,7 +412,7 @@ python3 gen_sa_accounts.py --download-keys $PROJECTID
 ```
 >**NOTE:** 1 Service Account can upload/copy around 750 GB a day, 1 project can make 100 Service Accounts so you can upload 75 TB a day.

->**NOTE:** All people can copy `2TB/DAY` from each file creator (uploader account), so if you got error `userRateLimitExceeded` that doesn't your limit exceeded but but file creator limit have been exceeded which is `2TB/DAY`.
+>**NOTE:** All people can copy `2TB/DAY` from each file creator (uploader account), so if you got error `userRateLimitExceeded` that doesn't mean your limit exceeded but file creator limit have been exceeded which is `2TB/DAY`.

 #### Two methods to create service accounts
 Choose one of these methods
@@ -202,7 +202,7 @@ def add_mega_download(mega_link, path, listener, name, from_queue=False):
         download_dict[listener.uid] = MegaDownloadStatus(mega_listener, listener)
     with queue_dict_lock:
         non_queued_dl.add(listener.uid)
-    makedirs(path)
+    makedirs(path, exist_ok=True)
     mega_listener.setValues(mname, size, gid)
     if not from_queue:
        listener.onDownloadStart()

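The hunk above swaps `makedirs(path)` for `makedirs(path, exist_ok=True)`: with several tasks now allowed to download into one shared directory, directory creation must tolerate a directory that already exists. A minimal sketch of the difference (paths are illustrative):

```python
from os import makedirs
from tempfile import TemporaryDirectory

# With several tasks sharing one directory, the first task creates it and
# later tasks must tolerate that it already exists. exist_ok=True makes
# makedirs idempotent instead of raising FileExistsError.
with TemporaryDirectory() as base:
    shared = f"{base}/downloads/12345"
    makedirs(shared, exist_ok=True)  # first task: creates the tree
    makedirs(shared, exist_ok=True)  # second task: no-op, no exception
    try:
        makedirs(shared)             # old behaviour: raises on the rerun
    except FileExistsError:
        print("FileExistsError without exist_ok=True")
```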
@@ -171,6 +171,7 @@ def __stop_duplicate(client, tor):

 @new_thread
 def __onDownloadComplete(client, tor):
+    sleep(2)
     download = getDownloadByGid(tor.hash[:12])
     try:
         listener = download.listener()

@@ -717,7 +717,7 @@ class GoogleDriveHelper:
             if meta.get("mimeType") == self.__G_DRIVE_DIR_MIME_TYPE:
                 self.__download_folder(file_id, self.__path, self.name)
             else:
-                makedirs(self.__path)
+                makedirs(self.__path, exist_ok=True)
                 self.__download_file(file_id, self.__path, self.name, meta.get('mimeType'))
         except Exception as err:
             if isinstance(err, RetryError):

@@ -1,11 +1,12 @@
 from logging import getLogger, ERROR
 from os import remove as osremove, walk, path as ospath, rename as osrename
 from time import time, sleep
-from pyrogram.types import InputMediaVideo, InputMediaDocument, InputMediaAudio, InputMediaPhoto
+from pyrogram.types import InputMediaVideo, InputMediaDocument
 from pyrogram.errors import FloodWait, RPCError
 from PIL import Image
 from threading import RLock
 from tenacity import retry, wait_exponential, stop_after_attempt, retry_if_exception_type, RetryError
+from re import search as re_search

 from bot import config_dict, user_data, GLOBAL_EXTENSION_FILTER, app
 from bot.helper.ext_utils.fs_utils import take_ss, get_media_info, get_media_streams, clean_unwanted
@@ -34,7 +35,8 @@ class TgUploader:
         self.__resource_lock = RLock()
         self.__is_corrupted = False
         self.__size = size
-        self.__media_dict = {'videos': {}, 'documents': {}, 'audios': {}, 'photos': {}}
+        self.__media_dict = {'videos': {}, 'documents': {}}
+        self.__last_msg_in_group = False
         self.__msg_to_reply()
         self.__user_settings()

@@ -55,22 +57,23 @@ class TgUploader:
                         continue
                     if self.__is_cancelled:
                         return
-                    if self.__media_group:
+                    if self.__last_msg_in_group:
                         group_lists = [x for v in self.__media_dict.values() for x in v.keys()]
-                        if dirpath not in group_lists:
+                        if (match := re_search(r'.+(?=\.0*\d+$)|.+(?=\.part\d+\..+)', up_path)) \
+                           and not match.group(0) in group_lists:
                             for key, value in list(self.__media_dict.items()):
                                 for subkey, msgs in list(value.items()):
                                     if len(msgs) > 1:
                                         self.__send_media_group(subkey, key, msgs)
+                    self.__last_msg_in_group = False
                     up_path, cap_mono = self.__prepare_file(up_path, file_, dirpath)
                     self._last_uploaded = 0
-                    self.__upload_file(up_path, dirpath, cap_mono)
-                    if not self.__is_cancelled and \
-                       (not self.__listener.seed or self.__listener.newDir or dirpath.endswith("splited_files_mltb")):
-                        osremove(up_path)
+                    self.__upload_file(up_path, cap_mono)
                     if self.__is_cancelled:
                         return
-                    if (not self.__listener.isPrivate or config_dict['DUMP_CHAT']) and not self.__is_corrupted:
+                    if not self.__listener.seed or self.__listener.newDir or dirpath.endswith("splited_files_mltb"):
+                        osremove(up_path)
+                    if not self.__is_corrupted and (not self.__listener.isPrivate or config_dict['DUMP_CHAT']):
                         self.__msgs_dict[self.__sent_msg.link] = file_
                     sleep(1)
                 except Exception as err:
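The loop above groups split parts by the base name that `re_search(r'.+(?=\.0*\d+$)|.+(?=\.part\d+\..+)', up_path)` extracts: the first alternative handles numeric-suffix parts, the second handles `.partN.` names. A small sketch of what the pattern matches (file names are illustrative):

```python
from re import search

# The uploader groups split parts by the base name before the part suffix.
# The first alternative matches numeric-suffix parts like "movie.mkv.001";
# the second matches "archive.part1.rar"-style names.
PART_RE = r'.+(?=\.0*\d+$)|.+(?=\.part\d+\..+)'

def part_base(path):
    match = search(PART_RE, path)
    return match.group(0) if match else None

print(part_base("movie.mkv.001"))      # movie.mkv
print(part_base("archive.part1.rar"))  # archive
print(part_base("plain_file.mkv"))     # None — not a split part
```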
@@ -109,7 +112,7 @@ class TgUploader:

     @retry(wait=wait_exponential(multiplier=2, min=4, max=8), stop=stop_after_attempt(3),
            retry=retry_if_exception_type(Exception))
-    def __upload_file(self, up_path, dirpath, cap_mono, force_document=False):
+    def __upload_file(self, up_path, cap_mono, force_document=False):
         if self.__thumb is not None and not ospath.lexists(self.__thumb):
             self.__thumb = None
         thumb = self.__thumb
@@ -179,26 +182,32 @@ class TgUploader:
                                            caption=cap_mono,
                                            disable_notification=True,
                                            progress=self.__upload_progress)
-            if self.__media_group and not self.__sent_msg.animation:
-                if dirpath in self.__media_dict[key].keys():
-                    self.__media_dict[key][dirpath].append(self.__sent_msg)
-                else:
-                    self.__media_dict[key][dirpath] = [self.__sent_msg]
-                msgs = self.__media_dict[key][dirpath]
-                if len(msgs) == 10:
-                    self.__send_media_group(dirpath, key, msgs)
+            if not self.__is_cancelled and self.__media_group and (self.__sent_msg.video or self.__sent_msg.document):
+                key = 'documents' if self.__sent_msg.document else 'videos'
+                if match := re_search(r'.+(?=\.0*\d+$)|.+(?=\.part\d+\..+)', up_path):
+                    pname = match.group(0)
+                    if pname in self.__media_dict[key].keys():
+                        self.__media_dict[key][pname].append(self.__sent_msg)
+                    else:
+                        self.__media_dict[key][pname] = [self.__sent_msg]
+                    msgs = self.__media_dict[key][pname]
+                    if len(msgs) == 10:
+                        self.__send_media_group(pname, key, msgs)
+                    else:
+                        self.__last_msg_in_group = True
         except FloodWait as f:
             LOGGER.warning(str(f))
             sleep(f.value)
         except Exception as err:
             err_type = "RPCError: " if isinstance(err, RPCError) else ""
             LOGGER.error(f"{err_type}{err}. Path: {up_path}")
-            if self.__thumb is None and thumb is not None and ospath.lexists(thumb):
-                osremove(thumb)
             if 'Telegram says: [400' in str(err) and key != 'documents':
                 LOGGER.error(f"Retrying As Document. Path: {up_path}")
-                return self.__upload_file(up_path, dirpath, cap_mono, True)
+                return self.__upload_file(up_path, cap_mono, True)
             raise err
+        finally:
+            if self.__thumb is None and thumb is not None and ospath.lexists(thumb):
+                osremove(thumb)

     def __upload_progress(self, current, total):
         if self.__is_cancelled:
@@ -232,12 +241,8 @@ class TgUploader:
         for msg in self.__media_dict[key][subkey]:
             if key == 'videos':
                 input_media = InputMediaVideo(media=msg.video.file_id, caption=msg.caption)
-            elif key == 'documents':
-                input_media = InputMediaDocument(media=msg.document.file_id, caption=msg.caption)
-            elif key == 'photos':
-                input_media = InputMediaPhoto(media=msg.photo.file_id, caption=msg.caption)
             else:
-                input_media = InputMediaAudio(media=msg.audio.file_id, caption=msg.caption)
+                input_media = InputMediaDocument(media=msg.document.file_id, caption=msg.caption)
             rlist.append(input_media)
         return rlist

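Telegram albums hold at most 10 items, which is why the uploader flushes a buffered group as soon as `len(msgs) == 10` and otherwise keeps accumulating. A minimal sketch of that batching (names are illustrative):

```python
# Telegram media groups (albums) hold at most 10 items, so buffered part
# messages are flushed in chunks of 10. A minimal sketch of that batching:
def chunk_album(msgs, size=10):
    return [msgs[i:i + size] for i in range(0, len(msgs), size)]

parts = [f"movie.mkv.{n:03d}" for n in range(1, 24)]  # 23 split parts
albums = chunk_album(parts)
print([len(a) for a in albums])  # [10, 10, 3]
```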
@@ -380,11 +380,16 @@ def get_buttons(key=None, edit_type=None):
             buttons.sbutton(int(x/10), f"botset start qbit {x}", position='footer')
         msg = f'Qbittorrent Options | Page: {int(START/10)} | State: {STATE}'
     elif edit_type == 'editvar':
+        msg = ''
         buttons.sbutton('Back', "botset back var")
         if key not in ['TELEGRAM_HASH', 'TELEGRAM_API', 'OWNER_ID', 'BOT_TOKEN']:
             buttons.sbutton('Default', f"botset resetvar {key}")
         buttons.sbutton('Close', "botset close")
-        msg = f'Send a valid value for {key}. Timeout: 60 sec'
+        if key in ['SUDO_USERS', 'RSS_USER_SESSION_STRING', 'IGNORE_PENDING_REQUESTS', 'CMD_SUFFIX', 'OWNER_ID',
+                   'USER_SESSION_STRING', 'TELEGRAM_HASH', 'TELEGRAM_API', 'AUTHORIZED_CHATS', 'RSS_DELAY',
+                   'DATABASE_URL', 'BOT_TOKEN', 'DOWNLOAD_DIR']:
+            msg += 'Restart required for this edit to take effect!\n\n'
+        msg += f'Send a valid value for {key}. Timeout: 60 sec'
     elif edit_type == 'editaria':
         buttons.sbutton('Back', "botset back aria")
         if key != 'newkey':
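A pitfall worth noting for multi-line key lists like the one above: Python concatenates adjacent string literals, so a comma dropped between two entries silently merges them into a single key and the membership test for both then fails:

```python
# Python silently concatenates adjacent string literals, so a missing comma
# in a multi-line list turns two keys into one and the membership test for
# both keys then fails:
broken = ['AUTHORIZED_CHATS', 'RSS_DELAY'
          'DATABASE_URL', 'BOT_TOKEN']  # no comma after 'RSS_DELAY'!
print('RSS_DELAY' in broken)              # False
print('RSS_DELAYDATABASE_URL' in broken)  # True — the merged literal

fixed = ['AUTHORIZED_CHATS', 'RSS_DELAY', 'DATABASE_URL', 'BOT_TOKEN']
print('RSS_DELAY' in fixed)               # True
```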
@@ -717,12 +722,7 @@ def edit_bot_settings(update, context):
         update_buttons(message)
         dispatcher.remove_handler(file_handler)
     elif data[1] == 'editvar' and STATE == 'edit':
-        if data[2] in ['SUDO_USERS', 'RSS_USER_SESSION_STRING', 'IGNORE_PENDING_REQUESTS', 'CMD_SUFFIX', 'OWNER_ID',
-                       'USER_SESSION_STRING', 'TELEGRAM_HASH', 'TELEGRAM_API', 'AUTHORIZED_CHATS', 'RSS_DELAY'
-                       'DATABASE_URL', 'BOT_TOKEN', 'DOWNLOAD_DIR']:
-            query.answer(text='Restart required for this edit to take effect!', show_alert=True)
-        else:
-            query.answer()
+        query.answer()
         if handler_dict.get(message.chat.id):
             handler_dict[message.chat.id] = False
             sleep(0.5)

@@ -1,9 +1,10 @@
 from requests import utils as rutils
 from re import search as re_search
 from time import sleep
-from os import path as ospath, remove as osremove, listdir, walk
+from os import path as ospath, remove as osremove, listdir, walk, rename, makedirs
 from subprocess import Popen
 from html import escape
+from shutil import move

 from bot import Interval, aria2, DOWNLOAD_DIR, download_dict, download_dict_lock, LOGGER, DATABASE_URL, MAX_SPLIT_SIZE, config_dict, status_reply_dict_lock, user_data, non_queued_up, non_queued_dl, queued_up, queued_dl, queue_dict_lock
 from bot.helper.ext_utils.fs_utils import get_base_name, get_path_size, split_file, clean_download, clean_target
@@ -23,7 +24,7 @@ from bot.helper.ext_utils.db_handler import DbManger


 class MirrorLeechListener:
-    def __init__(self, bot, message, isZip=False, extract=False, isQbit=False, isLeech=False, pswd=None, tag=None, select=False, seed=False):
+    def __init__(self, bot, message, isZip=False, extract=False, isQbit=False, isLeech=False, pswd=None, tag=None, select=False, seed=False, sameDir=''):
         self.bot = bot
         self.message = message
         self.uid = message.message_id
@@ -40,6 +41,7 @@ class MirrorLeechListener:
         self.isPrivate = message.chat.type in ['private', 'group']
         self.suproc = None
         self.queuedUp = False
+        self.sameDir = sameDir

     def clean(self):
         try:
@@ -57,13 +59,28 @@ class MirrorLeechListener:

     def onDownloadComplete(self):
         with download_dict_lock:
+            if len(self.sameDir) > 1:
+                LOGGER.info(self.sameDir)
+                self.sameDir.remove(self.uid)
+                LOGGER.info(self.sameDir)
+                folder_name = listdir(self.dir)[-1]
+                path = f"{self.dir}/{folder_name}"
+                des_path = f"{DOWNLOAD_DIR}{list(self.sameDir)[0]}/{folder_name}"
+                makedirs(des_path, exist_ok=True)
+                for subdir in listdir(path):
+                    sub_path = f"{self.dir}/{folder_name}/{subdir}"
+                    if subdir in listdir(des_path):
+                        rename(sub_path, f"{self.dir}/{folder_name}/1-{subdir}")
+                        sub_path = f"{self.dir}/{folder_name}/1-{subdir}"
+                    move(sub_path, des_path)
+                del download_dict[self.uid]
+                return
             download = download_dict[self.uid]
             name = str(download.name()).replace('/', '')
             gid = download.gid()
         LOGGER.info(f"Download completed: {name}")
         if name == "None" or self.isQbit or not ospath.exists(f"{self.dir}/{name}"):
             name = listdir(self.dir)[-1]
-        m_path = f'{self.dir}/{name}'
+        m_path = f"{self.dir}/{name}"
         size = get_path_size(m_path)
         with queue_dict_lock:
             if self.uid in non_queued_dl:
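The `sameDir` branch above merges each finished task's folder into the first task's directory, renaming a colliding entry with a `1-` prefix before moving it. A standalone sketch of that merge with hypothetical paths (note that `os.rename` returns `None`, so the renamed path has to be rebuilt rather than taken from its return value):

```python
from os import listdir, makedirs, rename
from os.path import join
from shutil import move
from tempfile import TemporaryDirectory

def merge_into(src, des):
    # Move everything from src into des; an entry that already exists at the
    # destination is first renamed with a "1-" prefix to avoid clobbering it.
    makedirs(des, exist_ok=True)
    for entry in listdir(src):
        sub_path = join(src, entry)
        if entry in listdir(des):
            renamed = join(src, f"1-{entry}")
            rename(sub_path, renamed)  # rename() returns None, so rebuild the path
            sub_path = renamed
        move(sub_path, des)

with TemporaryDirectory() as base:
    src, des = join(base, "task2"), join(base, "task1")
    makedirs(src)
    makedirs(des)
    open(join(src, "a.part1.rar"), "w").close()
    open(join(des, "a.part1.rar"), "w").close()  # name collision
    merge_into(src, des)
    print(sorted(listdir(des)))  # ['1-a.part1.rar', 'a.part1.rar']
```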
@@ -324,6 +341,8 @@ class MirrorLeechListener:
             if self.uid in download_dict.keys():
                 del download_dict[self.uid]
             count = len(download_dict)
+            if self.uid in self.sameDir:
+                self.sameDir.remove(self.uid)
         msg = f"{self.tag} your download has been stopped due to: {escape(error)}"
         sendMessage(msg, self.bot, self.message)
         if count == 0:
@@ -355,6 +374,8 @@ class MirrorLeechListener:
             if self.uid in download_dict.keys():
                 del download_dict[self.uid]
             count = len(download_dict)
+            if self.uid in self.sameDir:
+                self.sameDir.remove(self.uid)
         sendMessage(f"{self.tag} {escape(error)}", self.bot, self.message)
         if count == 0:
             self.clean()

@@ -21,7 +21,7 @@ from bot.helper.telegram_helper.message_utils import sendMessage
 from .listener import MirrorLeechListener


-def _mirror_leech(bot, message, isZip=False, extract=False, isQbit=False, isLeech=False):
+def _mirror_leech(bot, message, isZip=False, extract=False, isQbit=False, isLeech=False, sameDir=''):
     if not isLeech and not config_dict['GDRIVE_ID']:
         sendMessage('GDRIVE_ID not Provided!', bot, message)
         return
@@ -34,9 +34,10 @@ def _mirror_leech(bot, message, isZip=False, extract=False, isQbit=False, isLeec
     seed = False
     multi = 0
     link = ''
+    folder_name = ''

     if len(message_args) > 1:
-        args = mesg[0].split(maxsplit=3)
+        args = mesg[0].split(maxsplit=4)
         for x in args:
             x = x.strip()
             if x in ['|', 'pswd:']:
@@ -57,12 +58,23 @@ def _mirror_leech(bot, message, isZip=False, extract=False, isQbit=False, isLeec
         elif x.isdigit():
             multi = int(x)
             mi = index
+        elif x.startswith('m:'):
+            marg = x.split('m:', 1)
+            if len(marg) > 1:
+                folder_name = f"/{marg[-1]}"
+                if not sameDir:
+                    sameDir = set()
+                sameDir.add(message.message_id)
     if multi == 0:
         message_args = mesg[0].split(maxsplit=index)
         if len(message_args) > index:
             link = message_args[index].strip()
             if link.startswith(("|", "pswd:")):
                 link = ''
+    if len(folder_name) > 0:
+        seed = False
+        ratio = None
+        seed_time = None

     def __run_multi():
         if multi > 1:
|
|||||||
msg = message.text.split(maxsplit=mi+1)
|
msg = message.text.split(maxsplit=mi+1)
|
||||||
msg[mi] = f"{multi - 1}"
|
msg[mi] = f"{multi - 1}"
|
||||||
nextmsg = sendMessage(" ".join(msg), bot, nextmsg)
|
nextmsg = sendMessage(" ".join(msg), bot, nextmsg)
|
||||||
|
if len(folder_name) > 0:
|
||||||
|
sameDir.add(nextmsg.message_id)
|
||||||
nextmsg.from_user.id = message.from_user.id
|
nextmsg.from_user.id = message.from_user.id
|
||||||
sleep(4)
|
sleep(4)
|
||||||
Thread(target=_mirror_leech, args=(bot, nextmsg, isZip, extract, isQbit, isLeech)).start()
|
Thread(target=_mirror_leech, args=(bot, nextmsg, isZip, extract, isQbit, isLeech, sameDir)).start()
|
||||||
|
|
||||||
|
path = f'{DOWNLOAD_DIR}{message.message_id}{folder_name}'
|
||||||
|
|
||||||
name = mesg[0].split('|', maxsplit=1)
|
name = mesg[0].split('|', maxsplit=1)
|
||||||
if len(name) > 1:
|
if len(name) > 1:
|
||||||
@ -113,8 +129,8 @@ def _mirror_leech(bot, message, isZip=False, extract=False, isQbit=False, isLeec
|
|||||||
elif isinstance(file_, list):
|
elif isinstance(file_, list):
|
||||||
link = file_[-1].get_file().file_path
|
link = file_[-1].get_file().file_path
|
||||||
elif not isQbit and file_.mime_type != "application/x-bittorrent":
|
elif not isQbit and file_.mime_type != "application/x-bittorrent":
|
||||||
listener = MirrorLeechListener(bot, message, isZip, extract, isQbit, isLeech, pswd, tag)
|
listener = MirrorLeechListener(bot, message, isZip, extract, isQbit, isLeech, pswd, tag, sameDir=sameDir)
|
||||||
Thread(target=TelegramDownloadHelper(listener).add_download, args=(message, f'{DOWNLOAD_DIR}{listener.uid}/', name)).start()
|
Thread(target=TelegramDownloadHelper(listener).add_download, args=(message, f'{path}/', name)).start()
|
||||||
__run_multi()
|
__run_multi()
|
||||||
return
|
return
|
||||||
else:
|
else:
|
||||||
@ -145,6 +161,10 @@ Those options should be always before |newname or pswd:
|
|||||||
<code>/cmd</code> 10(number of links/files)
|
<code>/cmd</code> 10(number of links/files)
|
||||||
Number should be always before |newname or pswd:
|
Number should be always before |newname or pswd:
|
||||||
|
|
||||||
|
<b>Multi links within same upload directory only by replying to first link/file:</b>
|
||||||
|
<code>/cmd</code> 10(number of links/files) m:folder_name
|
||||||
|
Number and m:folder_name should be always before |newname or pswd:
|
||||||
|
|
||||||
<b>NOTES:</b>
|
<b>NOTES:</b>
|
||||||
1. When use cmd by reply don't add any option in link msg! always add them after cmd msg!
|
1. When use cmd by reply don't add any option in link msg! always add them after cmd msg!
|
||||||
2. You can't add those options <b>|newname, pswd:</b> randomly. They should be arranged like exmaple above, rename then pswd. Those options should be after the link if link along with the cmd and after any other option
|
2. You can't add those options <b>|newname, pswd:</b> randomly. They should be arranged like exmaple above, rename then pswd. Those options should be after the link if link along with the cmd and after any other option
|
||||||
@ -201,7 +221,7 @@ Number should be always before |newname or pswd:
|
|||||||
__run_multi()
|
__run_multi()
|
||||||
return
|
return
|
||||||
|
|
||||||
listener = MirrorLeechListener(bot, message, isZip, extract, isQbit, isLeech, pswd, tag, select, seed)
|
listener = MirrorLeechListener(bot, message, isZip, extract, isQbit, isLeech, pswd, tag, select, seed, sameDir)
|
||||||
|
|
||||||
if is_gdrive_link(link):
|
if is_gdrive_link(link):
|
||||||
if not isZip and not extract and not isLeech:
|
if not isZip and not extract and not isLeech:
|
||||||
@@ -210,11 +230,11 @@ Number should be always before |newname or pswd:
             gmsg += f"Use /{BotCommands.UnzipMirrorCommand[0]} to extracts Google Drive archive folder/file"
             sendMessage(gmsg, bot, message)
         else:
-            Thread(target=add_gd_download, args=(link, f'{DOWNLOAD_DIR}{listener.uid}', listener, name)).start()
+            Thread(target=add_gd_download, args=(link, path, listener, name)).start()
     elif is_mega_link(link):
-        Thread(target=add_mega_download, args=(link, f'{DOWNLOAD_DIR}{listener.uid}/', listener, name)).start()
+        Thread(target=add_mega_download, args=(link, f'{path}/', listener, name)).start()
     elif isQbit and (is_magnet(link) or ospath.exists(link)):
-        Thread(target=add_qb_torrent, args=(link, f'{DOWNLOAD_DIR}{listener.uid}', listener,
+        Thread(target=add_qb_torrent, args=(link, path, listener,
                                             ratio, seed_time)).start()
     else:
         if len(mesg) > 1:
|
|||||||
auth = "Basic " + b64encode(auth.encode()).decode('ascii')
|
auth = "Basic " + b64encode(auth.encode()).decode('ascii')
|
||||||
else:
|
else:
|
||||||
auth = ''
|
auth = ''
|
||||||
Thread(target=add_aria2c_download, args=(link, f'{DOWNLOAD_DIR}{listener.uid}', listener, name,
|
Thread(target=add_aria2c_download, args=(link, path, listener, name,
|
||||||
auth, ratio, seed_time)).start()
|
auth, ratio, seed_time)).start()
|
||||||
__run_multi()
|
__run_multi()
|
||||||
|
|
||||||
|
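The mirror/leech hunks above all make the same substitution: every hard-coded `f'{DOWNLOAD_DIR}{listener.uid}'` becomes a shared `path`, so several tasks started with `m:folder_name` can deliberately download into one directory (which is what lets zip/unzip work on combined Telegram file parts). A minimal sketch of that idea, with hypothetical names rather than the bot's actual helpers:

```python
def build_download_path(download_dir, message_id, folder_name=""):
    # When folder_name is set (the m:folder_name argument), every task in the
    # batch shares one directory; otherwise each task gets its own.
    return f"{download_dir}{message_id}{folder_name}"

# Tasks spawned by replying to the first link reuse its message id, so their
# paths collide on purpose and extraction can see all parts together.
first = build_download_path("/usr/src/app/downloads/", 100, "/parts")
second = build_download_path("/usr/src/app/downloads/", 100, "/parts")
assert first == second == "/usr/src/app/downloads/100/parts"
```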
@@ -44,10 +44,8 @@ def get_user_settings(from_user):
     MG = config_dict['MEDIA_GROUP']
     if not user_dict and MG or user_dict.get('media_group') or 'media_group' not in user_dict and MG:
         media_group = 'Enabled'
-        buttons.sbutton("Disable Media Group", f"userset {user_id} mgroup")
     else:
         media_group = 'Disabled'
-        buttons.sbutton("Enable Media Group", f"userset {user_id} mgroup")

     buttons.sbutton("YT-DLP Quality", f"userset {user_id} ytq")
     YQ = config_dict['YT_DLP_QUALITY']
@@ -191,7 +189,7 @@ def edit_user_settings(update, context):
     handler_dict[user_id] = True
     buttons = ButtonMaker()
     buttons.sbutton("Back", f"userset {user_id} back")
-    if user_dict.get('yt_ql'):
+    if user_dict.get('yt_ql') or config_dict['YT_DLP_QUALITY']:
         buttons.sbutton("Remove YT-DLP Quality", f"userset {user_id} rytq", 'header')
     buttons.sbutton("Close", f"userset {user_id} close")
     rmsg = f'''
@@ -228,10 +226,16 @@ Check all available qualities options <a href="https://github.com/yt-dlp/yt-dlp#
     buttons = ButtonMaker()
     if user_dict.get('split_size'):
         buttons.sbutton("Reset Split Size", f"userset {user_id} rlss")
-    if not user_dict and config_dict['EQUAL_SPLITS'] or user_dict.get('equal_splits'):
+    ES = config_dict['EQUAL_SPLITS']
+    if not user_dict and ES or user_dict.get('equal_splits') or 'equal_splits' not in user_dict and ES:
         buttons.sbutton("Disable Equal Splits", f"userset {user_id} esplits")
     else:
         buttons.sbutton("Enable Equal Splits", f"userset {user_id} esplits")
+    MG = config_dict['MEDIA_GROUP']
+    if not user_dict and MG or user_dict.get('media_group') or 'media_group' not in user_dict and MG:
+        buttons.sbutton("Disable Media Group", f"userset {user_id} mgroup")
+    else:
+        buttons.sbutton("Enable Media Group", f"userset {user_id} mgroup")
     buttons.sbutton("Back", f"userset {user_id} back")
     buttons.sbutton("Close", f"userset {user_id} close")
     editMessage(f'Send Leech split size in bytes. IS_PREMIUM_USER: {IS_PREMIUM_USER}. Timeout: 60 sec', message, buttons.build_menu(1))
@@ -260,6 +264,7 @@ Check all available qualities options <a href="https://github.com/yt-dlp/yt-dlp#
     DbManger().update_user_data(user_id)
 elif data[2] == 'mgroup':
     query.answer()
+    handler_dict[user_id] = False
     update_user_ldata(user_id, 'media_group', not bool(user_dict.get('media_group')))
     update_user_settings(message, query.from_user)
     if DATABASE_URL:
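The repeated `not user_dict and MG or user_dict.get('media_group') or 'media_group' not in user_dict and MG` check in the user-settings hunks leans on Python's operator precedence: `and` binds tighter than `or`, so it reads as "(no per-user settings and global default on) or (user enabled it) or (user never touched the key and global default on)". A standalone sketch of that evaluation (the function name is hypothetical; only the expression shape mirrors the bot's code):

```python
def media_group_enabled(user_dict, MG):
    # `and` binds tighter than `or`, so this groups as:
    # (not user_dict and MG) or user_dict.get('media_group')
    #                        or ('media_group' not in user_dict and MG)
    return bool(not user_dict and MG
                or user_dict.get('media_group')
                or 'media_group' not in user_dict and MG)

assert media_group_enabled({}, True) is True                        # no settings, global on
assert media_group_enabled({'media_group': True}, False) is True    # user override wins
assert media_group_enabled({'media_group': False}, True) is False   # explicit opt-out wins
assert media_group_enabled({'yt_ql': 'best'}, True) is True         # key untouched, global on
```

The `'media_group' not in user_dict and MG` clause is what distinguishes "user explicitly disabled it" from "user has other settings but never touched this one".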
@@ -14,7 +14,7 @@ from .listener import MirrorLeechListener

 listener_dict = {}

-def _ytdl(bot, message, isZip=False, isLeech=False):
+def _ytdl(bot, message, isZip=False, isLeech=False, sameDir=''):
     if not isLeech and not config_dict['GDRIVE_ID']:
         sendMessage('GDRIVE_ID not Provided!', bot, message)
         return
@@ -26,6 +26,7 @@ def _ytdl(bot, message, isZip=False, isLeech=False):
     multi = 0
     index = 1
     link = ''
+    folder_name = ''

     args = mssg.split(maxsplit=2)
     if len(args) > 1:
@@ -39,6 +40,13 @@ def _ytdl(bot, message, isZip=False, isLeech=False):
         elif x.strip().isdigit():
             multi = int(x)
             mi = index
+        elif x.startswith('m:'):
+            marg = x.split('m:', 1)
+            if len(marg) > 1:
+                folder_name = f"/{marg[-1]}"
+                if not sameDir:
+                    sameDir = set()
+                sameDir.add(message.message_id)
     if multi == 0:
         args = mssg.split(maxsplit=index)
         if len(args) > index:
@@ -57,9 +65,13 @@ def _ytdl(bot, message, isZip=False, isLeech=False):
         ymsg = mssg.split(maxsplit=mi+1)
         ymsg[mi] = f"{multi - 1}"
         nextmsg = sendMessage(" ".join(ymsg), bot, nextmsg)
+        if len(folder_name) > 0:
+            sameDir.add(nextmsg.message_id)
         nextmsg.from_user.id = message.from_user.id
         sleep(4)
-        Thread(target=_ytdl, args=(bot, nextmsg, isZip, isLeech)).start()
+        Thread(target=_ytdl, args=(bot, nextmsg, isZip, isLeech, sameDir)).start()

+    path = f'{DOWNLOAD_DIR}{message.message_id}{folder_name}'
+
     name = mssg.split('|', maxsplit=1)
     if len(name) > 1:
@@ -109,6 +121,10 @@ This option should be always before |newname, pswd: and opt:
 <code>/cmd</code> 10(number of links)
 Number should be always before |newname, pswd: and opt:

+<b>Multi links within same upload directory only by replying to first link:</b>
+<code>/cmd</code> 10(number of links) m:folder_name
+Number and m:folder_name should be always before |newname, pswd: and opt:
+
 <b>Options Note:</b> Add `^` before integer, some values must be integer and some string.
 Like playlist_items:10 works with string, so no need to add `^` before the number but playlistend works only with integer so you must add `^` before the number like example above.
 You can add tuple and dict also. Use double quotes inside dict.
@@ -123,7 +139,7 @@ Check all yt-dlp api options from this <a href='https://github.com/yt-dlp/yt-dlp
 """
         return sendMessage(help_msg, bot, message)

-    listener = MirrorLeechListener(bot, message, isZip, isLeech=isLeech, pswd=pswd, tag=tag)
+    listener = MirrorLeechListener(bot, message, isZip, isLeech=isLeech, pswd=pswd, tag=tag, sameDir=sameDir)
     ydl = YoutubeDLHelper(listener)
     try:
         result = ydl.extractMetaData(link, name, opt, True)
@@ -147,7 +163,7 @@ Check all yt-dlp api options from this <a href='https://github.com/yt-dlp/yt-dlp
     if qual:
         playlist = 'entries' in result
         LOGGER.info(f"Downloading with YT-DLP: {link}")
-        Thread(target=ydl.add_download, args=(link, f'{DOWNLOAD_DIR}{msg_id}', name, qual, playlist, opt)).start()
+        Thread(target=ydl.add_download, args=(link, path, name, qual, playlist, opt)).start()
     else:
         buttons = ButtonMaker()
         best_video = "bv*+ba/b"
@@ -168,7 +184,6 @@ Check all yt-dlp api options from this <a href='https://github.com/yt-dlp/yt-dlp
         buttons.sbutton("Best Audios", f"qu {msg_id} {best_audio} t")
         buttons.sbutton("Cancel", f"qu {msg_id} cancel")
         YTBUTTONS = buttons.build_menu(3)
-        listener_dict[msg_id] = [listener, user_id, link, name, YTBUTTONS, opt, formats_dict]
         bmsg = sendMessage('Choose Playlist Videos Quality:', bot, message, YTBUTTONS)
     else:
         formats = result.get('formats')
@@ -222,9 +237,9 @@ Check all yt-dlp api options from this <a href='https://github.com/yt-dlp/yt-dlp
         buttons.sbutton("Best Audio", f"qu {msg_id} {best_audio}")
         buttons.sbutton("Cancel", f"qu {msg_id} cancel")
         YTBUTTONS = buttons.build_menu(2)
-        listener_dict[msg_id] = [listener, user_id, link, name, YTBUTTONS, opt, formats_dict]
         bmsg = sendMessage('Choose Video Quality:', bot, message, YTBUTTONS)

+    listener_dict[msg_id] = [listener, user_id, link, name, YTBUTTONS, opt, formats_dict, path]
     Thread(target=_auto_cancel, args=(bmsg, msg_id)).start()

 __run_multi()
@@ -294,6 +309,7 @@ def select_format(update, context):
     name = task_info[3]
     opt = task_info[5]
     qual = data[2]
+    path = task_info[7]
     if len(data) == 4:
         playlist = True
         if '|' in qual:
@@ -305,7 +321,7 @@ def select_format(update, context):
     qual = task_info[6][b_name][tbr][1]
     ydl = YoutubeDLHelper(listener)
     LOGGER.info(f"Downloading with YT-DLP: {link}")
-    Thread(target=ydl.add_download, args=(link, f'{DOWNLOAD_DIR}{task_id}', name, qual, playlist, opt)).start()
+    Thread(target=ydl.add_download, args=(link, path, name, qual, playlist, opt)).start()
     query.message.delete()
     del listener_dict[task_id]
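The argument scan added to `_ytdl` above picks the link count and the new `m:folder_name` flag out of the command tokens. A self-contained sketch of just that parsing step (the function is a hypothetical extraction for illustration, not a bot API):

```python
def parse_multi_args(tokens):
    """Scan command tokens for an optional link count and an optional
    m:folder_name (shared-directory) flag, mirroring the loop added in _ytdl."""
    multi, folder_name = 0, ""
    for x in tokens:
        if x.strip().isdigit():
            multi = int(x)            # number of links to process
        elif x.startswith("m:"):
            marg = x.split("m:", 1)   # ["", "folder_name"]
            if len(marg) > 1:
                folder_name = f"/{marg[-1]}"
    return multi, folder_name

assert parse_multi_args(["10", "m:parts"]) == (10, "/parts")
assert parse_multi_args(["m:archive"]) == (0, "/archive")
assert parse_multi_args([]) == (0, "")
```

The non-empty `folder_name` is then appended to the per-message download path, which is how all replies to the first link end up sharing one directory.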