diff --git a/README.md b/README.md index 48fc411f..140988cb 100644 --- a/README.md +++ b/README.md @@ -7,7 +7,7 @@ This is a Telegram Bot written in Python for mirroring files on the Internet to - Select files from Torrent before downloading using qbittorrent and aria2c. - Leech (splitting, thumbnail for each user, setting as document or as media for each user). - Stop duplicates for all tasks except yt-dlp tasks. -- Zip/Unzip G-Drive links. +- Leech/Zip/Unzip G-Drive links. - Counting files/folders from Google Drive link. - View Link button, extra button to open file index link in broswer instead of direct download. - Status Pages for unlimited tasks. @@ -24,7 +24,7 @@ This is a Telegram Bot written in Python for mirroring files on the Internet to - Search on torrents with Torrent Search API or with variable plugins using qBittorrent search engine - Docker image support for linux `amd64, arm64/v8, arm/v7, s390x`. - Update bot at startup and with restart command using `UPSTREAM_REPO`. -- Qbittorrent seed until reaching specific ratio or time. +- Bittorrent seed until reaching specific ratio or time. - Rss feed and filter. Based on this repository [rss-chan](https://github.com/hyPnOtICDo0g/rss-chan). - Save leech settings including thumbnails in database. - Mirror/Leech/Clone multi links/files with one command. @@ -34,7 +34,7 @@ This is a Telegram Bot written in Python for mirroring files on the Internet to - Custom Name for all links except torrents. For files you should add extension except yt-dlp links. - Many bugs have been fixed. 
-## From Other Repositories +## From Base and other Repositories - Mirror direct download links, Torrent, and Telegram files to Google Drive - Mirror Mega.nz links to Google Drive - Copy files from someone's Drive to your Drive (Using Autorclone) @@ -49,11 +49,11 @@ This is a Telegram Bot written in Python for mirroring files on the Internet to - Shell and Executor - Add sudo users - Extract password protected files -- Extract these filetypes and uploads to Google Drive +- Extract these filetypes > ZIP, RAR, TAR, 7z, ISO, WIM, CAB, GZIP, BZIP2, APM, ARJ, CHM, CPIO, CramFS, DEB, DMG, FAT, HFS, LZH, LZMA, LZMA2, MBR, MSI, MSLZ, NSIS, NTFS, RPM, SquashFS, UDF, VHD, XAR, Z, TAR.XZ - Direct links Supported: - >mediafire, letsupload.io, hxfile.co, anonfiles.com, bayfiles.com, antfiles, fembed.com, fembed.net, femax20.com, layarkacaxxi.icu, fcdn.stream, sbplay.org, naniplay.com, naniplay.nanime.in, naniplay.nanime.biz, sbembed.com, streamtape.com, streamsb.net, feurl.com, upload.ee, pixeldrain.com, racaty.net, 1fichier.com, 1drv.ms (Only works for file not folder or business account), uptobox.com (Uptobox account must be premium) and solidfiles.com + >mediafire, letsupload.io, hxfile.co, anonfiles.com, bayfiles.com, antfiles, fembed.com, fembed.net, femax20.com, layarkacaxxi.icu, fcdn.stream, sbplay.org, naniplay.com, naniplay.nanime.in, naniplay.nanime.biz, sbembed.com, streamtape.com, streamsb.net, feurl.com, upload.ee, pixeldrain.com, racaty.net, 1fichier.com, 1drv.ms (Only works for file not folder or business account), uptobox.com and solidfiles.com # How to deploy? @@ -140,7 +140,6 @@ Fill up rest of the fields. Meaning of each field is discussed below: - `BASE_URL_OF_BOT`: Valid BASE URL where the bot is deployed to use qbittorrent web selection. Format of URL should be `http://myip`, where `myip` is the IP/Domain(public) of your bot or if you have chosen port other than `80` so write it in this format `http://myip:port` (`http` and not `https`). 
This Var is optional on VPS and required for Heroku specially to avoid app sleeping/idling. For Heroku fill `https://yourappname.herokuapp.com`. Still got idling? You can use http://cron-job.org to ping your Heroku app. `Str` - `SERVER_PORT`: Only For VPS, which is the **BASE_URL_OF_BOT** Port. `Str` - `WEB_PINCODE`: If empty or `False` means no more pincode required while qbit web selection. `Bool` -- `QB_SEED`: QB torrent will be seeded after and while uploading until reaching specific ratio or time, edit `MaxRatio` or `GlobalMaxSeedingMinutes` or both from qbittorrent.conf (`-1` means no limit, but u can cancel manually by gid). **NOTE**: 1. Don't change `MaxRatioAction`, 2. Only works with `/qbmirror` and `/qbzipmirror`. Also you can use this feature for specific torrent while using the bot and leave this variable empty. Default is `False`. `Bool` - **Qbittorrent NOTE**: If your facing ram exceeded issue then set limit for `MaxConnecs`, decrease `AsyncIOThreadsCount` in qbittorrent config and set limit of `DiskWriteCacheSize` to `32`. ### RSS @@ -332,7 +331,22 @@ help - All cmds with description ------ +## Bittorrent Seed + +- Add `d:ratio:time` prefix along with leech or mirror cmd. +- Using the `d` prefix alone uses the global options for aria2c or qbittorrent. + +### Qbittorrent +- Global options: `MaxRatio` and `GlobalMaxSeedingMinutes` in qbittorrent.conf, `-1` means no limit, but you can cancel manually. + - **NOTE**: Don't change `MaxRatioAction`. + +### Aria2c +- Global options: `--seed-ratio` (0 means no limit) and `--seed-time` (0 means no seed) in aria.sh. + +------ + ## Using Service Accounts for uploading to avoid user rate limit + >For Service Account to work, you must set `USE_SERVICE_ACCOUNTS` = "True" in config file or environment variables. >**NOTE**: Using Service Accounts is only recommended while uploading to a Team Drive.
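The `d:ratio:time` seed prefix introduced in the README above could be parsed with a small helper along the following lines. This is a hypothetical sketch for illustration only — the function name `parse_seed_prefix` and its return shape are invented here and are not the bot's actual parser; `ratio`/`seed_time` fall back to `None` so the global aria2c/qBittorrent options apply, matching the documented behavior of a bare `d`.

```python
import re

def parse_seed_prefix(args):
    """Parse an optional d[:ratio[:time]] seed prefix from command arguments.

    Returns (seed, ratio, seed_time, remaining_args). ratio and seed_time
    stay None when the aria2c/qBittorrent global options should apply.
    NOTE: hypothetical helper for illustration, not the bot's real code.
    """
    seed, ratio, seed_time = False, None, None
    if args and re.fullmatch(r'd(:[\d.]*(:\d*)?)?', args[0]):
        seed = True
        parts = args[0].split(':')
        if len(parts) > 1 and parts[1]:
            ratio = float(parts[1])      # e.g. 'd:1.5:60' -> seed ratio 1.5
        if len(parts) > 2 and parts[2]:
            seed_time = int(parts[2])    # seeding time limit
        args = args[1:]                  # strip the prefix from the command
    return seed, ratio, seed_time, args
```

For example, `parse_seed_prefix(['d:1.5:60', 'magnet_link'])` yields `(True, 1.5, 60, ['magnet_link'])`, while a bare `['d', 'magnet_link']` yields `(True, None, None, ['magnet_link'])`.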
diff --git a/aria.sh b/aria.sh index 99714ff9..fd7cf310 100755 --- a/aria.sh +++ b/aria.sh @@ -3,12 +3,11 @@ then TORRENT_TIMEOUT=0 fi tracker_list=$(curl -Ns https://raw.githubusercontent.com/XIU2/TrackersListCollection/master/all.txt https://ngosang.github.io/trackerslist/trackers_all_http.txt https://newtrackon.com/api/all https://raw.githubusercontent.com/hezhijie0327/Trackerslist/main/trackerslist_tracker.txt | awk '$0' | tr '\n\n' ',') -aria2c --enable-rpc=true --check-certificate=false --daemon=true \ - --max-connection-per-server=10 --rpc-max-request-size=1024M --quiet=true \ - --bt-stop-timeout=$TORRENT_TIMEOUT --min-split-size=10M --split=10 --allow-overwrite=true \ - --max-overall-download-limit=0 --bt-tracker="[$tracker_list]" --disk-cache=32M \ - --max-overall-upload-limit=1K --max-concurrent-downloads=15 --summary-interval=0 \ - --peer-id-prefix=-qB4430- --user-agent=Wget/1.12 --peer-agent=qBittorrent/4.4.3\ - --bt-enable-lpd=true --seed-time=0 --max-file-not-found=0 --max-tries=20 --follow-torrent=mem \ - --auto-file-renaming=true --reuse-uri=true --http-accept-gzip=true --continue=true \ - --content-disposition-default-utf8=true --netrc-path=/usr/src/app/.netrc --bt-remove-unselected-file=true +aria2c --allow-overwrite=true --auto-file-renaming=true --bt-enable-lpd=true \ + --bt-remove-unselected-file=true --bt-stop-timeout=$TORRENT_TIMEOUT --bt-tracker="[$tracker_list]" \ + --check-certificate=false --continue=true --content-disposition-default-utf8=true --daemon=true \ + --disk-cache=40M --enable-rpc=true --follow-torrent=mem --force-save=true --http-accept-gzip=true \ + --max-connection-per-server=10 --max-concurrent-downloads=20 --max-file-not-found=0 --max-tries=20 \ + --min-split-size=10M --netrc-path=/usr/src/app/.netrc --optimize-concurrent-downloads=true \ + --peer-id-prefix=-qB4430- --peer-agent=qBittorrent/4.4.3 --quiet=true --reuse-uri=true \ + --rpc-max-request-size=1024M --seed-ratio=0 --split=10 --summary-interval=0 
--user-agent=Wget/1.12 diff --git a/bot/__init__.py b/bot/__init__.py index 5af19e39..430d6e28 100644 --- a/bot/__init__.py +++ b/bot/__init__.py @@ -107,7 +107,7 @@ AUTHORIZED_CHATS = set() SUDO_USERS = set() AS_DOC_USERS = set() AS_MEDIA_USERS = set() -EXTENSION_FILTER = set() +EXTENSION_FILTER = set(['.aria2']) try: aid = getConfig('AUTHORIZED_CHATS') @@ -151,19 +151,19 @@ try: USER_SESSION_STRING = getConfig('USER_SESSION_STRING') if len(USER_SESSION_STRING) == 0: raise KeyError - LOGGER.info("Creating client from USER_SESSION_STRING") + log_info("Creating client from USER_SESSION_STRING") app = Client(name='pyrogram', api_id=int(TELEGRAM_API), api_hash=TELEGRAM_HASH, session_string=USER_SESSION_STRING, parse_mode=enums.ParseMode.HTML, no_updates=True) with app: IS_PREMIUM_USER = app.me.is_premium except: - LOGGER.info("Creating client from BOT_TOKEN") + log_info("Creating client from BOT_TOKEN") app = Client(name='pyrogram', api_id=int(TELEGRAM_API), api_hash=TELEGRAM_HASH, bot_token=BOT_TOKEN, parse_mode=enums.ParseMode.HTML, no_updates=True) try: RSS_USER_SESSION_STRING = getConfig('RSS_USER_SESSION_STRING') if len(RSS_USER_SESSION_STRING) == 0: raise KeyError - LOGGER.info("Creating client from RSS_USER_SESSION_STRING") + log_info("Creating client from RSS_USER_SESSION_STRING") rss_session = Client(name='rss_session', api_id=int(TELEGRAM_API), api_hash=TELEGRAM_HASH, session_string=RSS_USER_SESSION_STRING, parse_mode=enums.ParseMode.HTML, no_updates=True) except: rss_session = None @@ -212,10 +212,8 @@ try: raise KeyError LEECH_SPLIT_SIZE = int(LEECH_SPLIT_SIZE) except: - if not IS_PREMIUM_USER: - LEECH_SPLIT_SIZE = 2097152000 - else: - LEECH_SPLIT_SIZE = 4194304000 + LEECH_SPLIT_SIZE = 4194304000 if IS_PREMIUM_USER else 2097152000 +MAX_SPLIT_SIZE = 4194304000 if IS_PREMIUM_USER else 2097152000 try: STATUS_LIMIT = getConfig('STATUS_LIMIT') if len(STATUS_LIMIT) == 0: @@ -335,11 +333,6 @@ try: EQUAL_SPLITS = EQUAL_SPLITS.lower() == 'true' except: EQUAL_SPLITS 
= False -try: - QB_SEED = getConfig('QB_SEED') - QB_SEED = QB_SEED.lower() == 'true' -except: - QB_SEED = False try: CUSTOM_FILENAME = getConfig('CUSTOM_FILENAME') if len(CUSTOM_FILENAME) == 0: diff --git a/bot/__main__.py b/bot/__main__.py index 1e6fc285..0f621b16 100644 --- a/bot/__main__.py +++ b/bot/__main__.py @@ -9,16 +9,13 @@ from telegram.ext import CommandHandler from bot import bot, dispatcher, updater, botStartTime, IGNORE_PENDING_REQUESTS, LOGGER, Interval, INCOMPLETE_TASK_NOTIFIER, DB_URI, app, main_loop from .helper.ext_utils.fs_utils import start_cleanup, clean_all, exit_clean_up -from .helper.ext_utils.telegraph_helper import telegraph from .helper.ext_utils.bot_utils import get_readable_file_size, get_readable_time from .helper.ext_utils.db_handler import DbManger from .helper.telegram_helper.bot_commands import BotCommands from .helper.telegram_helper.message_utils import sendMessage, sendMarkup, editMessage, sendLogFile from .helper.telegram_helper.filters import CustomFilters from .helper.telegram_helper.button_build import ButtonMaker - -from .modules import authorize, list, cancel_mirror, mirror_status, mirror, clone, watch, shell, eval, delete, count, leech_settings, search, rss, bt_select - +from .modules import authorize, list, cancel_mirror, mirror_status, mirror_leech, clone, ytdlp, shell, eval, delete, count, leech_settings, search, rss, bt_select def stats(update, context): @@ -102,148 +99,60 @@ def ping(update, context): def log(update, context): sendLogFile(context.bot, update.message) - -help_string_telegraph = f'''
-/{BotCommands.HelpCommand}: To get this message -

-/{BotCommands.MirrorCommand} [download_url][magnet_link]: Start mirroring to Google Drive. Send /{BotCommands.MirrorCommand} for more help -

-/{BotCommands.ZipMirrorCommand} [download_url][magnet_link]: Start mirroring and upload the file/folder compressed with zip extension -

-/{BotCommands.UnzipMirrorCommand} [download_url][magnet_link]: Start mirroring and upload the file/folder extracted from any archive extension -

-/{BotCommands.QbMirrorCommand} [magnet_link][torrent_file][torrent_file_url]: Start Mirroring using qBittorrent. Send /{BotCommands.QbMirrorCommand} for more help -

-/{BotCommands.QbZipMirrorCommand} [magnet_link][torrent_file][torrent_file_url]: Start mirroring using qBittorrent and upload the file/folder compressed with zip extension -

-/{BotCommands.QbUnzipMirrorCommand} [magnet_link][torrent_file][torrent_file_url]: Start mirroring using qBittorrent and upload the file/folder extracted from any archive extension -

-/{BotCommands.LeechCommand} [download_url][magnet_link]: Start leeching to Telegram, Use /{BotCommands.LeechCommand} s to select files before leeching -

-/{BotCommands.ZipLeechCommand} [download_url][magnet_link]: Start leeching to Telegram and upload the file/folder compressed with zip extension -

-/{BotCommands.UnzipLeechCommand} [download_url][magnet_link][torent_file]: Start leeching to Telegram and upload the file/folder extracted from any archive extension -

-/{BotCommands.QbLeechCommand} [magnet_link][torrent_file][torrent_file_url]: Start leeching to Telegram using qBittorrent, Use /{BotCommands.QbLeechCommand} s to select files before leeching -

-/{BotCommands.QbZipLeechCommand} [magnet_link][torrent_file][torrent_file_url]: Start leeching to Telegram using qBittorrent and upload the file/folder compressed with zip extension -

-/{BotCommands.QbUnzipLeechCommand} [magnet_link][torrent_file][torrent_file_url]: Start leeching to Telegram using qBittorrent and upload the file/folder extracted from any archive extension -

-/{BotCommands.CloneCommand} [drive_url][gdtot_url]: Copy file/folder to Google Drive -

-/{BotCommands.CountCommand} [drive_url][gdtot_url]: Count file/folder of Google Drive -

-/{BotCommands.DeleteCommand} [drive_url]: Delete file/folder from Google Drive (Only Owner & Sudo) -

-/{BotCommands.WatchCommand} [yt-dlp supported link]: Mirror yt-dlp supported link. Send /{BotCommands.WatchCommand} for more help -

-/{BotCommands.ZipWatchCommand} [yt-dlp supported link]: Mirror yt-dlp supported link as zip -

-/{BotCommands.LeechWatchCommand} [yt-dlp supported link]: Leech yt-dlp supported link -

-/{BotCommands.LeechZipWatchCommand} [yt-dlp supported link]: Leech yt-dlp supported link as zip -

-/{BotCommands.LeechSetCommand}: Leech settings -

-/{BotCommands.SetThumbCommand}: Reply photo to set it as Thumbnail -

-/{BotCommands.BtSelectCommand}: Reply to an active /cmd which was used to start the bt-download or add gid along with cmd. This command mainly for selection incase you decided to select files from already added torrent. But you can always use /cmd with arg `s` to select files before download start -

-/{BotCommands.RssListCommand}: List all subscribed rss feed info -

-/{BotCommands.RssGetCommand}: [Title] [Number](last N links): Force fetch last N links -

-/{BotCommands.RssSubCommand}: [Title] [Rss Link] f: [filter]: Subscribe new rss feed -

-/{BotCommands.RssUnSubCommand}: [Title]: Unubscribe rss feed by title -

-/{BotCommands.RssSettingsCommand}: Rss Settings -

-/{BotCommands.CancelMirror}: Reply to the message by which the download was initiated and that download will be cancelled -

-/{BotCommands.CancelAllCommand}: Cancel all downloading tasks -

-/{BotCommands.ListCommand} [query]: Search in Google Drive(s) -

-/{BotCommands.SearchCommand} [query]: Search for torrents with API -
sites: rarbg, 1337x, yts, etzv, tgx, torlock, piratebay, nyaasi, ettv

-/{BotCommands.StatusCommand}: Shows a status of all the downloads -

-/{BotCommands.StatsCommand}: Show Stats of the machine the bot is hosted on -''' - -help = telegraph.create_page( - title='Mirror-Leech-Bot Help', - content=help_string_telegraph, - )["path"] - help_string = f''' -/{BotCommands.PingCommand}: Check how long it takes to Ping the Bot - -/{BotCommands.AuthorizeCommand}: Authorize a chat or a user to use the bot (Can only be invoked by Owner & Sudo of the bot) - -/{BotCommands.UnAuthorizeCommand}: Unauthorize a chat or a user to use the bot (Can only be invoked by Owner & Sudo of the bot) - -/{BotCommands.AuthorizedUsersCommand}: Show authorized users (Only Owner & Sudo) - -/{BotCommands.AddSudoCommand}: Add sudo user (Only Owner) - -/{BotCommands.RmSudoCommand}: Remove sudo users (Only Owner) - -/{BotCommands.RestartCommand}: Restart and update the bot - -/{BotCommands.LogCommand}: Get a log file of the bot. Handy for getting crash reports - -/{BotCommands.ShellCommand}: Run commands in Shell (Only Owner) - -/{BotCommands.ExecHelpCommand}: Get help for Executor module (Only Owner) +NOTE: Try each command without any prefix to see more details. +/{BotCommands.MirrorCommand[0]} or /{BotCommands.MirrorCommand[1]}: Start mirroring to Google Drive. +/{BotCommands.ZipMirrorCommand[0]} or /{BotCommands.ZipMirrorCommand[1]}: Start mirroring and upload the file/folder compressed with zip extension. +/{BotCommands.UnzipMirrorCommand[0]} or /{BotCommands.UnzipMirrorCommand[1]}: Start mirroring and upload the file/folder extracted from any archive extension. +/{BotCommands.QbMirrorCommand[0]} or /{BotCommands.QbMirrorCommand[1]}: Start Mirroring to Google Drive using qBittorrent. +/{BotCommands.QbZipMirrorCommand[0]} or /{BotCommands.QbZipMirrorCommand[1]}: Start mirroring using qBittorrent and upload the file/folder compressed with zip extension. +/{BotCommands.QbUnzipMirrorCommand[0]} or /{BotCommands.QbUnzipMirrorCommand[1]}: Start mirroring using qBittorrent and upload the file/folder extracted from any archive extension.
+/{BotCommands.YtdlCommand[0]} or /{BotCommands.YtdlCommand[1]}: Mirror yt-dlp supported link. +/{BotCommands.YtdlZipCommand[0]} or /{BotCommands.YtdlZipCommand[1]}: Mirror yt-dlp supported link as zip. +/{BotCommands.LeechCommand[0]} or /{BotCommands.LeechCommand[1]}: Start leeching to Telegram. +/{BotCommands.ZipLeechCommand[0]} or /{BotCommands.ZipLeechCommand[1]}: Start leeching and upload the file/folder compressed with zip extension. +/{BotCommands.UnzipLeechCommand[0]} or /{BotCommands.UnzipLeechCommand[1]}: Start leeching and upload the file/folder extracted from any archive extension. +/{BotCommands.QbLeechCommand[0]} or /{BotCommands.QbLeechCommand[1]}: Start leeching using qBittorrent. +/{BotCommands.QbZipLeechCommand[0]} or /{BotCommands.QbZipLeechCommand[1]}: Start leeching using qBittorrent and upload the file/folder compressed with zip extension. +/{BotCommands.QbUnzipLeechCommand[0]} or /{BotCommands.QbUnzipLeechCommand[1]}: Start leeching using qBittorrent and upload the file/folder extracted from any archive extension. +/{BotCommands.YtdlLeechCommand[0]} or /{BotCommands.YtdlLeechCommand[1]}: Leech yt-dlp supported link. +/{BotCommands.YtdlZipLeechCommand[0]} or /{BotCommands.YtdlZipLeechCommand[1]}: Leech yt-dlp supported link as zip. +/{BotCommands.CloneCommand} [drive_url]: Copy file/folder to Google Drive. +/{BotCommands.CountCommand} [drive_url]: Count file/folder of Google Drive. +/{BotCommands.DeleteCommand} [drive_url]: Delete file/folder from Google Drive (Only Owner & Sudo). +/{BotCommands.LeechSetCommand} [query]: Leech settings. +/{BotCommands.SetThumbCommand}: Reply photo to set it as Thumbnail. +/{BotCommands.BtSelectCommand}: Select files from torrents by gid or reply. +/{BotCommands.RssListCommand[0]} or /{BotCommands.RssListCommand[1]}: List all subscribed rss feed info (Only Owner & Sudo). +/{BotCommands.RssGetCommand[0]} or /{BotCommands.RssGetCommand[1]}: Force fetch last N links (Only Owner & Sudo). 
+/{BotCommands.RssSubCommand[0]} or /{BotCommands.RssSubCommand[1]}: Subscribe new rss feed (Only Owner & Sudo). +/{BotCommands.RssUnSubCommand[0]} or /{BotCommands.RssUnSubCommand[1]}: Unsubscribe rss feed by title (Only Owner & Sudo). +/{BotCommands.RssSettingsCommand[0]} or /{BotCommands.RssSettingsCommand[1]} [query]: Rss Settings (Only Owner & Sudo). +/{BotCommands.CancelMirror}: Cancel task by gid or reply. +/{BotCommands.CancelAllCommand} [query]: Cancel all [status] tasks. +/{BotCommands.ListCommand} [query]: Search in Google Drive(s). +/{BotCommands.SearchCommand} [query]: Search for torrents with API. +/{BotCommands.StatusCommand}: Shows a status of all the downloads. +/{BotCommands.StatsCommand}: Show stats of the machine where the bot is hosted on. +/{BotCommands.PingCommand}: Check how long it takes to Ping the Bot (Only Owner & Sudo). +/{BotCommands.AuthorizeCommand}: Authorize a chat or a user to use the bot (Only Owner & Sudo). +/{BotCommands.UnAuthorizeCommand}: Unauthorize a chat or a user to use the bot (Only Owner & Sudo). +/{BotCommands.AuthorizedUsersCommand}: Show authorized users (Only Owner & Sudo). +/{BotCommands.AddSudoCommand}: Add sudo user (Only Owner). +/{BotCommands.RmSudoCommand}: Remove sudo users (Only Owner). +/{BotCommands.RestartCommand}: Restart and update the bot (Only Owner & Sudo). +/{BotCommands.LogCommand}: Get a log file of the bot. Handy for getting crash reports (Only Owner & Sudo). +/{BotCommands.ShellCommand}: Run shell commands (Only Owner). +/{BotCommands.EvalCommand}: Run Python Code Line | Lines (Only Owner). +/{BotCommands.ExecCommand}: Run Commands In Exec (Only Owner). +/{BotCommands.ClearLocalsCommand}: Clear {BotCommands.EvalCommand} or {BotCommands.ExecCommand} locals (Only Owner).
''' def bot_help(update, context): - button = ButtonMaker() - button.buildbutton("Other Commands", f"https://telegra.ph/{help}") - reply_markup = InlineKeyboardMarkup(button.build_menu(1)) - sendMarkup(help_string, context.bot, update.message, reply_markup) - -botcmds = [ - - (f'{BotCommands.MirrorCommand}', 'Mirror'), - (f'{BotCommands.ZipMirrorCommand}','Mirror and upload as zip'), - (f'{BotCommands.UnzipMirrorCommand}','Mirror and extract files'), - (f'{BotCommands.QbMirrorCommand}','Mirror torrent using qBittorrent'), - (f'{BotCommands.QbZipMirrorCommand}','Mirror torrent and upload as zip using qb'), - (f'{BotCommands.QbUnzipMirrorCommand}','Mirror torrent and extract files using qb'), - (f'{BotCommands.WatchCommand}','Mirror yt-dlp supported link'), - (f'{BotCommands.ZipWatchCommand}','Mirror yt-dlp supported link as zip'), - (f'{BotCommands.CloneCommand}','Copy file/folder to Drive'), - (f'{BotCommands.LeechCommand}','Leech'), - (f'{BotCommands.ZipLeechCommand}','Leech and upload as zip'), - (f'{BotCommands.UnzipLeechCommand}','Leech and extract files'), - (f'{BotCommands.QbLeechCommand}','Leech torrent using qBittorrent'), - (f'{BotCommands.QbZipLeechCommand}','Leech torrent and upload as zip using qb'), - (f'{BotCommands.QbUnzipLeechCommand}','Leech torrent and extract using qb'), - (f'{BotCommands.LeechWatchCommand}','Leech yt-dlp supported link'), - (f'{BotCommands.LeechZipWatchCommand}','Leech yt-dlp supported link as zip'), - (f'{BotCommands.CountCommand}','Count file/folder of Drive'), - (f'{BotCommands.DeleteCommand}','Delete file/folder from Drive'), - (f'{BotCommands.CancelMirror}','Cancel a task'), - (f'{BotCommands.CancelAllCommand}','Cancel all downloading tasks'), - (f'{BotCommands.ListCommand}','Search in Drive'), - (f'{BotCommands.LeechSetCommand}','Leech settings'), - (f'{BotCommands.SetThumbCommand}','Set thumbnail'), - (f'{BotCommands.StatusCommand}','Get mirror status message'), - (f'{BotCommands.StatsCommand}','Bot usage stats'), - 
(f'{BotCommands.PingCommand}','Ping the bot'), - (f'{BotCommands.RestartCommand}','Restart the bot'), - (f'{BotCommands.LogCommand}','Get the bot Log'), - (f'{BotCommands.HelpCommand}','Get detailed help') - ] + sendMessage(help_string, context.bot, update.message) def main(): - # bot.set_my_commands(botcmds) start_cleanup() - notifier_dict = False if INCOMPLETE_TASK_NOTIFIER and DB_URI is not None: if notifier_dict := DbManger().get_incomplete_tasks(): for cid, data in notifier_dict.items(): diff --git a/bot/helper/ext_utils/bot_utils.py b/bot/helper/ext_utils/bot_utils.py index f972cf94..eb4460eb 100644 --- a/bot/helper/ext_utils/bot_utils.py +++ b/bot/helper/ext_utils/bot_utils.py @@ -127,35 +127,19 @@ def get_readable_message(): msg += f"\nStatus: {download.status()}" if download.status() not in [MirrorStatus.STATUS_SPLITTING, MirrorStatus.STATUS_SEEDING]: msg += f"\n{get_progress_bar_string(download)} {download.progress()}" - if download.status() in [MirrorStatus.STATUS_DOWNLOADING, - MirrorStatus.STATUS_WAITING, - MirrorStatus.STATUS_PAUSED]: - msg += f"\nDownloaded: {get_readable_file_size(download.processed_bytes())} of {download.size()}" - elif download.status() == MirrorStatus.STATUS_UPLOADING: - msg += f"\nUploaded: {get_readable_file_size(download.processed_bytes())} of {download.size()}" - elif download.status() == MirrorStatus.STATUS_CLONING: - msg += f"\nCloned: {get_readable_file_size(download.processed_bytes())} of {download.size()}" - elif download.status() == MirrorStatus.STATUS_ARCHIVING: - msg += f"\nArchived: {get_readable_file_size(download.processed_bytes())} of {download.size()}" - elif download.status() == MirrorStatus.STATUS_EXTRACTING: - msg += f"\nExtracted: {get_readable_file_size(download.processed_bytes())} of {download.size()}" + msg += f"\nProcessed: {get_readable_file_size(download.processed_bytes())} of {download.size()}" msg += f"\nSpeed: {download.speed()} | ETA: {download.eta()}" - try: - msg += f"\nSeeders: 
{download.aria_download().num_seeders}" \ - f" | Peers: {download.aria_download().connections}" - except: - pass - try: - msg += f"\nSeeders: {download.torrent_info().num_seeds}" \ - f" | Leechers: {download.torrent_info().num_leechs}" - except: - pass + if hasattr(download, 'seeders_num'): + try: + msg += f"\nSeeders: {download.seeders_num()} | Leechers: {download.leechers_num()}" + except: + pass elif download.status() == MirrorStatus.STATUS_SEEDING: msg += f"\nSize: {download.size()}" - msg += f"\nSpeed: {get_readable_file_size(download.torrent_info().upspeed)}/s" - msg += f" | Uploaded: {get_readable_file_size(download.torrent_info().uploaded)}" - msg += f"\nRatio: {round(download.torrent_info().ratio, 3)}" - msg += f" | Time: {get_readable_time(download.torrent_info().seeding_time)}" + msg += f"\nSpeed: {download.upload_speed()}" + msg += f" | Uploaded: {download.uploaded_bytes()}" + msg += f"\nRatio: {download.ratio()}" + msg += f" | Time: {download.seeding_time()}" else: msg += f"\nSize: {download.size()}" msg += f"\n/{BotCommands.CancelMirror} {download.gid()}" @@ -180,6 +164,12 @@ def get_readable_message(): upspeed_bytes += float(spd.split('K')[0]) * 1024 elif 'MB/s' in spd: upspeed_bytes += float(spd.split('M')[0]) * 1048576 + elif download.status() == MirrorStatus.STATUS_SEEDING: + spd = download.upload_speed() + if 'K' in spd: + upspeed_bytes += float(spd.split('K')[0]) * 1024 + elif 'M' in spd: + upspeed_bytes += float(spd.split('M')[0]) * 1048576 bmsg += f"\nDL: {get_readable_file_size(dlspeed_bytes)}/s | UL: {get_readable_file_size(upspeed_bytes)}/s" if STATUS_LIMIT is not None and tasks > STATUS_LIMIT: msg += f"Page: {PAGE_NO}/{pages} | Tasks: {tasks}\n" diff --git a/bot/helper/ext_utils/fs_utils.py b/bot/helper/ext_utils/fs_utils.py index c75e5da8..365f2be1 100644 --- a/bot/helper/ext_utils/fs_utils.py +++ b/bot/helper/ext_utils/fs_utils.py @@ -10,12 +10,7 @@ from math import ceil from re import split as re_split, I from .exceptions import 
NotSupportedExtractionArchive -from bot import aria2, app, LOGGER, DOWNLOAD_DIR, get_client, LEECH_SPLIT_SIZE, EQUAL_SPLITS, IS_PREMIUM_USER - -if IS_PREMIUM_USER: - MAX_SPLIT_SIZE = 4194304000 -else: - MAX_SPLIT_SIZE = 2097152000 +from bot import aria2, app, LOGGER, DOWNLOAD_DIR, get_client, LEECH_SPLIT_SIZE, EQUAL_SPLITS, IS_PREMIUM_USER, MAX_SPLIT_SIZE VIDEO_SUFFIXES = ("M4V", "MP4", "MOV", "FLV", "WMV", "3GP", "MPG", "WEBM", "MKV", "AVI") @@ -24,6 +19,19 @@ ARCH_EXT = [".tar.bz2", ".tar.gz", ".bz2", ".gz", ".tar.xz", ".tar", ".tbz2", ". ".cpio", ".cramfs", ".deb", ".dmg", ".fat", ".hfs", ".lzh", ".lzma", ".mbr", ".msi", ".mslz", ".nsis", ".ntfs", ".rpm", ".squashfs", ".udf", ".vhd", ".xar"] +def clean_target(path: str): + if ospath.exists(path): + LOGGER.info(f"Cleaning Target: {path}") + if ospath.isdir(path): + try: + rmtree(path) + except: + pass + elif ospath.isfile(path): + try: + osremove(path) + except: + pass def clean_download(path: str): if ospath.exists(path): @@ -62,11 +70,10 @@ def clean_unwanted(path: str): LOGGER.info(f"Cleaning unwanted files/folders: {path}") for dirpath, subdir, files in walk(path, topdown=False): for filee in files: - if filee.endswith((".!qB", ".aria2")) or filee.endswith('.parts') and filee.startswith('.'): + if filee.endswith(".!qB") or filee.endswith('.parts') and filee.startswith('.'): osremove(ospath.join(dirpath, filee)) - for folder in subdir: - if folder == ".unwanted": - rmtree(ospath.join(dirpath, folder)) + if dirpath.endswith((".unwanted", "splited_files_mltb")): + rmtree(dirpath) for dirpath, subdir, files in walk(path, topdown=False): if not listdir(dirpath): rmdir(dirpath) @@ -117,6 +124,9 @@ def take_ss(video_file): return des_dir def split_file(path, size, file_, dirpath, split_size, listener, start_time=0, i=1, inLoop=False, noMap=False): + if listener.seed and not listener.newDir: + dirpath = f"{dirpath}/splited_files_mltb" + mkdir(dirpath) parts = ceil(size/LEECH_SPLIT_SIZE) if EQUAL_SPLITS and not 
inLoop: split_size = ceil(size/parts) + 1000 @@ -129,20 +139,29 @@ out_path = ospath.join(dirpath, parted_name) if not noMap: listener.suproc = Popen(["ffmpeg", "-hide_banner", "-loglevel", "error", "-ss", str(start_time), - "-i", path, "-fs", str(split_size), "-map", "0", "-map_chapters", "-1", "-c", "copy", out_path]) + "-i", path, "-fs", str(split_size), "-map", "0", "-map_chapters", "-1", + "-c", "copy", out_path]) else: listener.suproc = Popen(["ffmpeg", "-hide_banner", "-loglevel", "error", "-ss", str(start_time), - "-i", path, "-fs", str(split_size), "-map_chapters", "-1", "-c", "copy", out_path]) + "-i", path, "-fs", str(split_size), "-map_chapters", "-1", "-c", "copy", + out_path]) listener.suproc.wait() if listener.suproc.returncode == -9: return False elif listener.suproc.returncode != 0 and not noMap: - LOGGER.warning(f'Retrying without map, -map 0 not working in all situations. Path: {path}') + LOGGER.warning(f"Retrying without map, -map 0 not working in all situations. Path: {path}") try: osremove(out_path) except: pass return split_file(path, size, file_, dirpath, split_size, listener, start_time, i, True, True) + elif listener.suproc.returncode != 0: + LOGGER.warning(f"Unable to split this video; if its size is less than {MAX_SPLIT_SIZE} it will be uploaded as is.
Path: {path}") + try: + osremove(out_path) + except: + pass + return "errored" out_size = get_path_size(out_path) if out_size > MAX_SPLIT_SIZE: dif = out_size - MAX_SPLIT_SIZE @@ -163,7 +182,8 @@ def split_file(path, size, file_, dirpath, split_size, listener, start_time=0, i i = i + 1 else: out_path = ospath.join(dirpath, file_ + ".") - listener.suproc = Popen(["split", "--numeric-suffixes=1", "--suffix-length=3", f"--bytes={split_size}", path, out_path]) + listener.suproc = Popen(["split", "--numeric-suffixes=1", "--suffix-length=3", + f"--bytes={split_size}", path, out_path]) listener.suproc.wait() if listener.suproc.returncode == -9: return False diff --git a/bot/helper/ext_utils/html_helper.py b/bot/helper/ext_utils/html_helper.py new file mode 100644 index 00000000..0a2c4e8b --- /dev/null +++ b/bot/helper/ext_utils/html_helper.py @@ -0,0 +1,125 @@ +hmtl_content = """ + + + + + + {fileName} + + + + + + + +{msg} + + +""" + +html_template = """ + + + + + + {title} + + + + + + + +{msg} + + +""" diff --git a/bot/helper/ext_utils/telegraph_helper.py b/bot/helper/ext_utils/telegraph_helper.py deleted file mode 100644 index 01a7c016..00000000 --- a/bot/helper/ext_utils/telegraph_helper.py +++ /dev/null @@ -1,81 +0,0 @@ -# Implement By - @VarnaX-279 - -from string import ascii_letters -from random import SystemRandom - -from time import sleep -from telegraph import Telegraph -from telegraph.exceptions import RetryAfterError - -from bot import LOGGER - - -class TelegraphHelper: - def __init__(self, author_name=None, author_url=None): - self.telegraph = Telegraph() - self.short_name = ''.join(SystemRandom().choices(ascii_letters, k=8)) - self.access_token = None - self.author_name = author_name - self.author_url = author_url - self.create_account() - - def create_account(self): - self.telegraph.create_account( - short_name=self.short_name, - author_name=self.author_name, - author_url=self.author_url - ) - self.access_token = self.telegraph.get_access_token() - 
LOGGER.info("Creating Telegraph Account") - - def create_page(self, title, content): - try: - return self.telegraph.create_page( - title = title, - author_name=self.author_name, - author_url=self.author_url, - html_content=content - ) - except RetryAfterError as st: - LOGGER.warning(f'Telegraph Flood control exceeded. I will sleep for {st.retry_after} seconds.') - sleep(st.retry_after) - return self.create_page(title, content) - - def edit_page(self, path, title, content): - try: - return self.telegraph.edit_page( - path = path, - title = title, - author_name=self.author_name, - author_url=self.author_url, - html_content=content - ) - except RetryAfterError as st: - LOGGER.warning(f'Telegraph Flood control exceeded. I will sleep for {st.retry_after} seconds.') - sleep(st.retry_after) - return self.edit_page(path, title, content) - - def edit_telegraph(self, path, telegraph_content): - nxt_page = 1 - prev_page = 0 - num_of_path = len(path) - for content in telegraph_content : - if nxt_page == 1 : - content += f'Next' - nxt_page += 1 - else : - if prev_page <= num_of_path: - content += f'Prev' - prev_page += 1 - if nxt_page < num_of_path: - content += f' | Next' - nxt_page += 1 - self.edit_page( - path = path[prev_page], - title = 'Mirror-leech-bot Torrent Search', - content=content - ) - return - - -telegraph=TelegraphHelper('Mirror-Leech-Telegram-Bot', 'https://github.com/anasty17/mirror-leech-telegram-bot') diff --git a/bot/helper/mirror_utils/download_utils/aria2_download.py b/bot/helper/mirror_utils/download_utils/aria2_download.py index 17b917f9..11b3edf4 100644 --- a/bot/helper/mirror_utils/download_utils/aria2_download.py +++ b/bot/helper/mirror_utils/download_utils/aria2_download.py @@ -1,10 +1,10 @@ -from time import sleep +from time import sleep, time from bot import aria2, download_dict_lock, download_dict, STOP_DUPLICATE, BASE_URL, LOGGER from bot.helper.mirror_utils.upload_utils.gdriveTools import GoogleDriveHelper from bot.helper.ext_utils.bot_utils 
import is_magnet, getDownloadByGid, new_thread, bt_selection_buttons from bot.helper.mirror_utils.status_utils.aria_download_status import AriaDownloadStatus -from bot.helper.telegram_helper.message_utils import sendMarkup, sendStatusMessage, sendMessage, deleteMessage +from bot.helper.telegram_helper.message_utils import sendMarkup, sendStatusMessage, sendMessage, deleteMessage, update_all_messages from bot.helper.ext_utils.fs_utils import get_base_name, clean_unwanted @@ -12,34 +12,43 @@ from bot.helper.ext_utils.fs_utils import get_base_name, clean_unwanted def __onDownloadStarted(api, gid): download = api.get_download(gid) if download.is_metadata: - LOGGER.info(f'onDownloadStarted: {gid} Metadata') + LOGGER.info(f'onDownloadStarted: {gid} METADATA') + sleep(1) dl = getDownloadByGid(gid) - if dl.listener().select: + listener = dl.listener() + if listener.select: metamsg = "Downloading Metadata, wait then you can select files. Use torrent file to avoid this wait." - meta = sendMessage(metamsg, dl.listener().bot, dl.listener().message) + meta = sendMessage(metamsg, listener.bot, listener.message) while True: - download = api.get_download(gid) - if download.followed_by_ids: - deleteMessage(dl.listener().bot, meta) + try: + download = api.get_download(gid) + except: + deleteMessage(listener.bot, meta) + break + if download.followed_by_ids: + deleteMessage(listener.bot, meta) break - sleep(1) return else: - LOGGER.info(f'onDownloadStarted: {gid}') + LOGGER.info(f'onDownloadStarted: {download.name} - Gid: {gid}') try: if STOP_DUPLICATE: + sleep(1) + dl = getDownloadByGid(gid) + if not dl: + return + listener = dl.listener() + if listener.isLeech or listener.select: + return download = api.get_download(gid) if not download.is_torrent: sleep(3) - download = api.get_download(gid) - dl = getDownloadByGid(gid) - if not dl or dl.listener().isLeech: - return + download = download.live LOGGER.info('Checking File/Folder if already in Drive...') sname = download.name - if 
dl.listener().isZip: + if listener.isZip: sname = sname + ".zip" - elif dl.listener().extract: + elif listener.extract: try: sname = get_base_name(sname) except: @@ -47,30 +56,76 @@ def __onDownloadStarted(api, gid): if sname is not None: smsg, button = GoogleDriveHelper().drive_list(sname, True) if smsg: - dl.listener().onDownloadError('File/Folder already available in Drive.\n\n') + listener.onDownloadError('File/Folder already available in Drive.\n\n') api.remove([download], force=True, files=True) - return sendMarkup("Here are the search results:", dl.listener().bot, dl.listener().message, button) + return sendMarkup("Here are the search results:", listener.bot, listener.message, button) except Exception as e: LOGGER.error(f"{e} onDownloadStart: {gid} check duplicate didn't pass") @new_thread def __onDownloadComplete(api, gid): - download = api.get_download(gid) + try: + download = api.get_download(gid) + except: + return if download.followed_by_ids: new_gid = download.followed_by_ids[0] LOGGER.info(f'Gid changed from {gid} to {new_gid}') - if BASE_URL is not None: - dl = getDownloadByGid(new_gid) - if dl and dl.listener().select: - api.client.force_pause(new_gid) - SBUTTONS = bt_selection_buttons(new_gid) - msg = "Your download paused. Choose files then press Done Selecting button to start downloading." - sendMarkup(msg, dl.listener().bot, dl.listener().message, SBUTTONS) - elif dl := getDownloadByGid(gid): - LOGGER.info(f"onDownloadComplete: {gid}") - if dl.listener().select: - clean_unwanted(dl.path()) - dl.listener().onDownloadComplete() + dl = getDownloadByGid(new_gid) + listener = dl.listener() + if BASE_URL is not None and listener.select: + SBUTTONS = bt_selection_buttons(new_gid) + msg = "Your download paused. Choose files then press Done Selecting button to start downloading." 
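The seeding messages added around this hunk format a share ratio and an elapsed seeding time ("Seeding stopped with Ratio: ... and Time: ..."). A minimal, self-contained sketch of those two calculations follows; the helper names are hypothetical, `get_readable_time` is a simplified stand-in for the bot's own formatter, and the patch's real `ratio()` helpers return the value as a string:

```python
def ratio(uploaded_bytes, completed_bytes):
    """Share ratio rounded to 3 decimals, mirroring round(upload / completed, 3)."""
    return round(uploaded_bytes / completed_bytes, 3)

def get_readable_time(seconds):
    """Format a duration like '1h2m5s' (simplified stand-in, an assumption)."""
    seconds = int(seconds)
    result = ""
    for suffix, period in (("d", 86400), ("h", 3600), ("m", 60)):
        value, seconds = divmod(seconds, period)
        if value:
            result += f"{value}{suffix}"
    return f"{result}{seconds}s"

# Same shape as the "Seeding stopped with ..." status message in this patch.
print(f"Seeding stopped with Ratio: {ratio(3_000_000, 2_000_000)} "
      f"and Time: {get_readable_time(3725)}")
```

With the sample numbers above this prints a ratio of 1.5 and a time of 1h2m5s.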
+ sendMarkup(msg, listener.bot, listener.message, SBUTTONS) + elif download.is_torrent: + if dl := getDownloadByGid(gid): + if hasattr(dl, 'listener'): + listener = dl.listener() + if hasattr(listener, 'uploaded'): + LOGGER.info(f"Cancelling Seed: {download.name} onDownloadComplete") + listener.onUploadError(f"Seeding stopped with Ratio: {dl.ratio()} and Time: {dl.seeding_time()}") + api.remove([download], force=True, files=True) + else: + LOGGER.info(f"onDownloadComplete: {download.name} - Gid: {gid}") + if dl := getDownloadByGid(gid): + dl.listener().onDownloadComplete() + api.remove([download], force=True, files=True) + +@new_thread +def __onBtDownloadComplete(api, gid): + seed_start_time = time() + sleep(1) + download = api.get_download(gid) + LOGGER.info(f"onBtDownloadComplete: {download.name} - Gid: {gid}") + if dl := getDownloadByGid(gid): + listener = dl.listener() + if listener.select: + clean_unwanted(download.dir) + if listener.seed: + try: + api.set_options({'max-upload-limit': '0'}, [download]) + except Exception as e: + LOGGER.error(f'{e} You are not able to seed because you added global option seed-time=0 without adding specific seed_time for this torrent') + listener.onDownloadComplete() + if listener.seed: + with download_dict_lock: + if listener.uid not in download_dict: + api.remove([download], force=True, files=True) + return + download_dict[listener.uid] = AriaDownloadStatus(gid, listener) + download_dict[listener.uid].start_time = seed_start_time + LOGGER.info(f"Seeding started: {download.name} - Gid: {gid}") + download = download.live + if download.is_complete: + if dl := getDownloadByGid(gid): + LOGGER.info(f"Cancelling Seed: {download.name}") + listener.onUploadError(f"Seeding stopped with Ratio: {dl.ratio()} and Time: {dl.seeding_time()}") + api.remove([download], force=True, files=True) + else: + listener.uploaded = True + update_all_messages() + else: + api.remove([download], force=True, files=True) @new_thread def 
__onDownloadStopped(api, gid): @@ -81,6 +136,7 @@ def __onDownloadStopped(api, gid): @new_thread def __onDownloadError(api, gid): LOGGER.info(f"onDownloadError: {gid}") + error = "None" try: download = api.get_download(gid) error = download.error_message @@ -96,20 +152,30 @@ def start_listener(): on_download_error=__onDownloadError, on_download_stop=__onDownloadStopped, on_download_complete=__onDownloadComplete, - timeout=30) + on_bt_download_complete=__onBtDownloadComplete, + timeout=60) -def add_aria2c_download(link: str, path, listener, filename, auth, select): +def add_aria2c_download(link: str, path, listener, filename, auth, select, ratio, seed_time): + args = {'dir': path, 'max-upload-limit': '1K'} + if filename: + args['out'] = filename + if auth: + args['header'] = f"authorization: {auth}" + if ratio: + args['seed-ratio'] = ratio + if seed_time: + args['seed-time'] = seed_time if is_magnet(link): - download = aria2.add_magnet(link, {'dir': path}) + download = aria2.add_magnet(link, args) else: - download = aria2.add_uris([link], {'dir': path, 'out': filename, 'header': f"authorization: {auth}"}) + download = aria2.add_uris([link], args) if download.error_message: error = str(download.error_message).replace('<', ' ').replace('>', ' ') LOGGER.info(f"Download Error: {error}") return sendMessage(error, listener.bot, listener.message) with download_dict_lock: download_dict[listener.uid] = AriaDownloadStatus(download.gid, listener) - LOGGER.info(f"Started: {download.gid} DIR: {download.dir} ") + LOGGER.info(f"Aria2Download started: {download.gid}") listener.onDownloadStart() if not select: sendStatusMessage(listener.message, listener.bot) diff --git a/bot/helper/mirror_utils/download_utils/direct_link_generator.py b/bot/helper/mirror_utils/download_utils/direct_link_generator.py index 0c9a1cf8..46a767ec 100644 --- a/bot/helper/mirror_utils/download_utils/direct_link_generator.py +++ b/bot/helper/mirror_utils/download_utils/direct_link_generator.py @@ -28,7 +28,7 
@@ fmed_list = ['fembed.net', 'fembed.com', 'femax20.com', 'fcdn.stream', 'feurl.co def direct_link_generator(link: str): """ direct links generator """ if 'youtube.com' in link or 'youtu.be' in link: - raise DirectDownloadLinkException(f"ERROR: Use watch cmds for Youtube links") + raise DirectDownloadLinkException(f"ERROR: Use ytdl cmds for Youtube links") elif 'yadi.sk' in link or 'disk.yandex.com' in link: return yandex_disk(link) elif 'mediafire.com' in link: diff --git a/bot/helper/mirror_utils/download_utils/gd_downloader.py b/bot/helper/mirror_utils/download_utils/gd_downloader.py index 9eb519cb..e139f3a0 100644 --- a/bot/helper/mirror_utils/download_utils/gd_downloader.py +++ b/bot/helper/mirror_utils/download_utils/gd_downloader.py @@ -8,7 +8,7 @@ from bot.helper.telegram_helper.message_utils import sendMessage, sendStatusMess from bot.helper.ext_utils.fs_utils import get_base_name -def add_gd_download(link, listener, newname): +def add_gd_download(link, path, listener, newname): res, size, name, files = GoogleDriveHelper().helper(link) if res != "": return sendMessage(res, listener.bot, listener.message) @@ -29,7 +29,7 @@ def add_gd_download(link, listener, newname): msg = "File/Folder is already available in Drive.\nHere are the search results:" return sendMarkup(msg, listener.bot, listener.message, button) LOGGER.info(f"Download Name: {name}") - drive = GoogleDriveHelper(name, listener) + drive = GoogleDriveHelper(name, path, size, listener) gid = ''.join(SystemRandom().choices(ascii_letters + digits, k=12)) download_status = GdDownloadStatus(drive, size, listener, gid) with download_dict_lock: diff --git a/bot/helper/mirror_utils/download_utils/mega_downloader.py b/bot/helper/mirror_utils/download_utils/mega_downloader.py index dc43ffd5..9b17066f 100644 --- a/bot/helper/mirror_utils/download_utils/mega_downloader.py +++ b/bot/helper/mirror_utils/download_utils/mega_downloader.py @@ -76,7 +76,7 @@ class MegaAppListener(MegaListener): LOGGER.error(f'Mega 
Request error in {error}') if not self.is_cancelled: self.is_cancelled = True - self.listener.onDownloadError("RequestTempError: " + error.toString()) + self.listener.onDownloadError(f"RequestTempError: {error.toString()}") self.error = error.toString() self.continue_event.set() diff --git a/bot/helper/mirror_utils/download_utils/qbit_downloader.py b/bot/helper/mirror_utils/download_utils/qbit_downloader.py index 4acd8dbf..b4c0a31e 100644 --- a/bot/helper/mirror_utils/download_utils/qbit_downloader.py +++ b/bot/helper/mirror_utils/download_utils/qbit_downloader.py @@ -30,7 +30,7 @@ class QbDownloader: self.__dupChecked = False self.__rechecked = False - def add_qb_torrent(self, link, path, select): + def add_qb_torrent(self, link, path, select, ratio, seed_time): self.__path = path self.select = select self.client = get_client() @@ -44,9 +44,9 @@ class QbDownloader: sendMessage("This Torrent already added!", self.__listener.bot, self.__listener.message) return self.client.auth_log_out() if link.startswith('magnet:'): - op = self.client.torrents_add(link, save_path=path) + op = self.client.torrents_add(link, save_path=path, ratio_limit=ratio, seeding_time_limit=seed_time) else: - op = self.client.torrents_add(torrent_files=[link], save_path=path) + op = self.client.torrents_add(torrent_files=[link], save_path=path, ratio_limit=ratio, seeding_time_limit=seed_time) sleep(0.3) if op.lower() == "ok.": tor_info = self.client.torrents_info(torrent_hashes=self.ext_hash) @@ -109,7 +109,7 @@ class QbDownloader: self.__onDownloadError("Dead Torrent!") elif tor_info.state == "downloading": self.__stalled_time = time() - if not self.__dupChecked and STOP_DUPLICATE and ospath.isdir(f'{self.__path}') and not self.__listener.isLeech and not self.select: + if not self.select and not self.__dupChecked and STOP_DUPLICATE and not self.__listener.isLeech and ospath.isdir(f'{self.__path}'): LOGGER.info('Checking File/Folder if already in Drive') qbname = 
str(listdir(f'{self.__path}')[-1]) if qbname.endswith('.!qB'): @@ -149,26 +149,20 @@ class QbDownloader: if self.select: clean_unwanted(self.__path) self.__listener.onDownloadComplete() - if self.__listener.seed and not self.__listener.isLeech and not self.__listener.extract: + if self.__listener.seed: with download_dict_lock: if self.__listener.uid not in download_dict: - self.client.torrents_delete(torrent_hashes=self.ext_hash, delete_files=True) - self.client.auth_log_out() - self.__periodic.cancel() + self.__remove_torrent() return download_dict[self.__listener.uid] = QbDownloadStatus(self.__listener, self) self.is_seeding = True update_all_messages() - LOGGER.info(f"Seeding started: {self.__name}") + LOGGER.info(f"Seeding started: {self.__name} - Hash: {self.ext_hash}") else: - self.client.torrents_delete(torrent_hashes=self.ext_hash, delete_files=True) - self.client.auth_log_out() - self.__periodic.cancel() + self.__remove_torrent() elif tor_info.state == 'pausedUP' and self.__listener.seed: self.__listener.onUploadError(f"Seeding stopped with Ratio: {round(tor_info.ratio, 3)} and Time: {get_readable_time(tor_info.seeding_time)}") - self.client.torrents_delete(torrent_hashes=self.ext_hash, delete_files=True) - self.client.auth_log_out() - self.__periodic.cancel() + self.__remove_torrent() except Exception as e: LOGGER.error(str(e)) @@ -177,6 +171,9 @@ class QbDownloader: self.client.torrents_pause(torrent_hashes=self.ext_hash) sleep(0.3) self.__listener.onDownloadError(err) + self.__remove_torrent() + + def __remove_torrent(self): self.client.torrents_delete(torrent_hashes=self.ext_hash, delete_files=True) self.client.auth_log_out() self.__periodic.cancel() diff --git a/bot/helper/mirror_utils/download_utils/youtube_dl_download_helper.py b/bot/helper/mirror_utils/download_utils/youtube_dl_download_helper.py index d9b98073..b5c4697a 100644 --- a/bot/helper/mirror_utils/download_utils/youtube_dl_download_helper.py +++ 
b/bot/helper/mirror_utils/download_utils/youtube_dl_download_helper.py @@ -173,8 +173,8 @@ class YoutubeDLHelper: if len(audio_info) == 2: rate = audio_info[1] else: - rate = 320 - self.opts['postprocessors'] = [{'key': 'FFmpegExtractAudio','preferredcodec': 'mp3','preferredquality': f'{rate}'}] + rate = '320' + self.opts['postprocessors'] = [{'key': 'FFmpegExtractAudio', 'preferredcodec': 'mp3', 'preferredquality': rate}] self.opts['format'] = qual LOGGER.info(f"Downloading with YT-DLP: {link}") self.extractMetaData(link, name, args) diff --git a/bot/helper/mirror_utils/status_utils/aria_download_status.py b/bot/helper/mirror_utils/status_utils/aria_download_status.py index 99372207..627fa420 100644 --- a/bot/helper/mirror_utils/status_utils/aria_download_status.py +++ b/bot/helper/mirror_utils/status_utils/aria_download_status.py @@ -1,5 +1,7 @@ -from bot import aria2, DOWNLOAD_DIR, LOGGER -from bot.helper.ext_utils.bot_utils import MirrorStatus +from time import time + +from bot import aria2, LOGGER +from bot.helper.ext_utils.bot_utils import MirrorStatus, get_readable_time def get_download(gid): try: @@ -14,15 +16,14 @@ class AriaDownloadStatus: self.__gid = gid self.__download = get_download(gid) self.__listener = listener + self.start_time = 0 self.message = listener.message - def path(self): - return f'{DOWNLOAD_DIR}{self.__listener.uid}' - def __update(self): - self.__download = get_download(self.__gid) + self.__download = self.__download.live if self.__download.followed_by_ids: self.__gid = self.__download.followed_by_ids[0] + self.__download = get_download(self.__gid) def progress(self): """ @@ -61,11 +62,28 @@ class AriaDownloadStatus: return MirrorStatus.STATUS_WAITING elif download.is_paused: return MirrorStatus.STATUS_PAUSED + elif download.seeder and hasattr(self.__listener, 'uploaded'): + return MirrorStatus.STATUS_SEEDING else: return MirrorStatus.STATUS_DOWNLOADING - def aria_download(self): - return self.__download + def seeders_num(self): + 
return self.__download.num_seeders + + def leechers_num(self): + return self.__download.connections + + def uploaded_bytes(self): + return self.__download.upload_length_string() + + def upload_speed(self): + return self.__download.upload_speed_string() + + def ratio(self): + return f"{round(self.__download.upload_length / self.__download.completed_length, 3)}" + + def seeding_time(self): + return f"{get_readable_time(time() - self.start_time)}" def download(self): return self @@ -78,18 +96,17 @@ class AriaDownloadStatus: return self.__gid def cancel_download(self): - LOGGER.info(f"Cancelling Download: {self.name()}") self.__update() - download = self.__download - if download.is_waiting: - self.__listener.onDownloadError("Cancelled by user") - aria2.remove([download], force=True, files=True) - return - if len(download.followed_by_ids) != 0: - downloads = aria2.get_downloads(download.followed_by_ids) + if self.__download.seeder: + LOGGER.info(f"Cancelling Seed: {self.name()}") + self.__listener.onUploadError(f"Seeding stopped with Ratio: {self.ratio()} and Time: {self.seeding_time()}") + aria2.remove([self.__download], force=True, files=True) + elif len(self.__download.followed_by_ids) != 0: + LOGGER.info(f"Cancelling Download: {self.name()}") + downloads = aria2.get_downloads(self.__download.followed_by_ids) self.__listener.onDownloadError('Download stopped by user!') aria2.remove(downloads, force=True, files=True) - aria2.remove([download], force=True, files=True) - return - self.__listener.onDownloadError('Download stopped by user!') - aria2.remove([download], force=True, files=True) + else: + LOGGER.info(f"Cancelling Download: {self.name()}") + self.__listener.onDownloadError('Download stopped by user!') + aria2.remove([self.__download], force=True, files=True) diff --git a/bot/helper/mirror_utils/status_utils/extract_status.py b/bot/helper/mirror_utils/status_utils/extract_status.py index 0463e583..653c11e8 100644 ---
a/bot/helper/mirror_utils/status_utils/extract_status.py +++ b/bot/helper/mirror_utils/status_utils/extract_status.py @@ -53,7 +53,10 @@ class ExtractStatus: return MirrorStatus.STATUS_EXTRACTING def processed_bytes(self): - return get_path_size(f"{DOWNLOAD_DIR}{self.__uid}") - self.__size + if self.__listener.newDir: + return get_path_size(f"{DOWNLOAD_DIR}{self.__uid}10000") + else: + return get_path_size(f"{DOWNLOAD_DIR}{self.__uid}") - self.__size def download(self): return self diff --git a/bot/helper/mirror_utils/status_utils/gd_download_status.py b/bot/helper/mirror_utils/status_utils/gd_download_status.py index 448220f0..914669a0 100644 --- a/bot/helper/mirror_utils/status_utils/gd_download_status.py +++ b/bot/helper/mirror_utils/status_utils/gd_download_status.py @@ -9,7 +9,7 @@ class GdDownloadStatus: self.message = listener.message def processed_bytes(self): - return self.__obj.downloaded_bytes + return self.__obj.processed_bytes def size_raw(self): return self.__size @@ -28,7 +28,7 @@ class GdDownloadStatus: def progress_raw(self): try: - return self.__obj.downloaded_bytes / self.__size * 100 + return self.__obj.processed_bytes / self.__size * 100 except: return 0 @@ -39,14 +39,14 @@ class GdDownloadStatus: """ :return: Download speed in Bytes/Seconds """ - return self.__obj.dspeed() + return self.__obj.speed() def speed(self): return f'{get_readable_file_size(self.speed_raw())}/s' def eta(self): try: - seconds = (self.__size - self.__obj.downloaded_bytes) / self.speed_raw() + seconds = (self.__size - self.__obj.processed_bytes) / self.speed_raw() return f'{get_readable_time(seconds)}' except: return '-' diff --git a/bot/helper/mirror_utils/status_utils/qbit_download_status.py b/bot/helper/mirror_utils/status_utils/qbit_download_status.py index f0af5164..e4cabf54 100644 --- a/bot/helper/mirror_utils/status_utils/qbit_download_status.py +++ b/bot/helper/mirror_utils/status_utils/qbit_download_status.py @@ -70,8 +70,23 @@ class QbDownloadStatus: else: 
return MirrorStatus.STATUS_DOWNLOADING - def torrent_info(self): - return self.__info + def seeders_num(self): + return self.__info.num_seeds + + def leechers_num(self): + return self.__info.num_leechs + + def uploaded_bytes(self): + return f"{get_readable_file_size(self.__info.uploaded)}" + + def upload_speed(self): + return f"{get_readable_file_size(self.__info.upspeed)}/s" + + def ratio(self): + return f"{round(self.__info.ratio, 3)}" + + def seeding_time(self): + return f"{get_readable_time(self.__info.seeding_time)}" def download(self): return self.__obj diff --git a/bot/helper/mirror_utils/status_utils/upload_status.py b/bot/helper/mirror_utils/status_utils/upload_status.py index 87edecbc..b9c2422b 100644 --- a/bot/helper/mirror_utils/status_utils/upload_status.py +++ b/bot/helper/mirror_utils/status_utils/upload_status.py @@ -9,7 +9,7 @@ class UploadStatus: self.message = listener.message def processed_bytes(self): - return self.__obj.uploaded_bytes + return self.__obj.processed_bytes def size_raw(self): return self.__size @@ -25,7 +25,7 @@ class UploadStatus: def progress_raw(self): try: - return self.__obj.uploaded_bytes / self.__size * 100 + return self.__obj.processed_bytes / self.__size * 100 except ZeroDivisionError: return 0 @@ -43,7 +43,7 @@ class UploadStatus: def eta(self): try: - seconds = (self.__size - self.__obj.uploaded_bytes) / self.speed_raw() + seconds = (self.__size - self.__obj.processed_bytes) / self.speed_raw() return f'{get_readable_time(seconds)}' except ZeroDivisionError: return '-' diff --git a/bot/helper/mirror_utils/status_utils/zip_status.py b/bot/helper/mirror_utils/status_utils/zip_status.py index 2b1cb638..b60886e4 100644 --- a/bot/helper/mirror_utils/status_utils/zip_status.py +++ b/bot/helper/mirror_utils/status_utils/zip_status.py @@ -53,7 +53,10 @@ class ZipStatus: return MirrorStatus.STATUS_ARCHIVING def processed_bytes(self): - return get_path_size(f"{DOWNLOAD_DIR}{self.__uid}") - self.__size + if self.__listener.newDir: 
+ return get_path_size(f"{DOWNLOAD_DIR}{self.__uid}10000") + else: + return get_path_size(f"{DOWNLOAD_DIR}{self.__uid}") - self.__size def download(self): return self diff --git a/bot/helper/mirror_utils/upload_utils/gdriveTools.py b/bot/helper/mirror_utils/upload_utils/gdriveTools.py index 018398be..46b976fd 100644 --- a/bot/helper/mirror_utils/upload_utils/gdriveTools.py +++ b/bot/helper/mirror_utils/upload_utils/gdriveTools.py @@ -2,7 +2,7 @@ from logging import getLogger, ERROR from time import time from pickle import load as pload from json import loads as jsnloads -from os import makedirs, path as ospath, listdir +from os import makedirs, path as ospath, listdir, remove as osremove from requests.utils import quote as rquote from io import FileIO from re import search as re_search @@ -16,11 +16,10 @@ from telegram import InlineKeyboardMarkup from tenacity import retry, wait_exponential, stop_after_attempt, retry_if_exception_type, RetryError from bot.helper.telegram_helper.button_build import ButtonMaker -from bot import parent_id, DOWNLOAD_DIR, IS_TEAM_DRIVE, INDEX_URL, USE_SERVICE_ACCOUNTS, VIEW_LINK, \ +from bot import parent_id, IS_TEAM_DRIVE, INDEX_URL, USE_SERVICE_ACCOUNTS, VIEW_LINK, \ DRIVES_NAMES, DRIVES_IDS, INDEX_URLS, EXTENSION_FILTER -from bot.helper.ext_utils.telegraph_helper import telegraph from bot.helper.ext_utils.bot_utils import get_readable_file_size, setInterval -from bot.helper.ext_utils.fs_utils import get_mime_type, get_path_size +from bot.helper.ext_utils.fs_utils import get_mime_type LOGGER = getLogger(__name__) getLogger('googleapiclient.discovery').setLevel(ERROR) @@ -31,40 +30,36 @@ if USE_SERVICE_ACCOUNTS: class GoogleDriveHelper: - def __init__(self, name=None, listener=None): + def __init__(self, name=None, path=None, size=0, listener=None): self.__G_DRIVE_TOKEN_FILE = "token.pickle" - # Check https://developers.google.com/drive/scopes for all available scopes self.__OAUTH_SCOPE = ['https://www.googleapis.com/auth/drive'] - # 
Redirect URI for installed apps, can be left as is - self.__REDIRECT_URI = "urn:ietf:wg:oauth:2.0:oob" self.__G_DRIVE_DIR_MIME_TYPE = "application/vnd.google-apps.folder" self.__G_DRIVE_BASE_DOWNLOAD_URL = "https://drive.google.com/uc?id={}&export=download" self.__G_DRIVE_DIR_BASE_DOWNLOAD_URL = "https://drive.google.com/drive/folders/{}" self.__listener = listener + self.__path = path self.__service = self.__authorize() - self._file_uploaded_bytes = 0 - self._file_downloaded_bytes = 0 - self.uploaded_bytes = 0 - self.downloaded_bytes = 0 - self.start_time = 0 - self.total_time = 0 - self.dtotal_time = 0 - self.is_uploading = False - self.is_downloading = False - self.is_cloning = False - self.is_cancelled = False - self.is_errored = False - self.status = None - self.dstatus = None - self.updater = None - self.name = name - self.update_interval = 3 self.__total_bytes = 0 self.__total_files = 0 self.__total_folders = 0 - self.transferred_size = 0 self.__sa_count = 0 - self.alt_auth = False + self.__start_time = 0 + self.__total_time = 0 + self.__alt_auth = False + self.__is_uploading = False + self.__is_downloading = False + self.__is_cloning = False + self.__is_cancelled = False + self.__is_errored = False + self.__status = None + self.__updater = None + self.__update_interval = 3 + self.__size = size + self._file_processed_bytes = 0 + self.name = name + self.processed_bytes = 0 + self.transferred_size = 0 + def speed(self): """ @@ -72,19 +67,13 @@ class GoogleDriveHelper: :return: Upload speed in bytes/second """ try: - return self.uploaded_bytes / self.total_time - except: - return 0 - - def dspeed(self): - try: - return self.downloaded_bytes / self.dtotal_time + return self.processed_bytes / self.__total_time except: return 0 def cspeed(self): try: - return self.transferred_size / int(time() - self.start_time) + return self.transferred_size / int(time() - self.__start_time) except: return 0 @@ -99,12 +88,12 @@ class GoogleDriveHelper: parsed = urlparse(link) 
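The `urlparse`/`parse_qs` lines just above pull a Drive file ID out of query-style links (`uc?export=download&id=...`). A hedged, self-contained sketch of the overall ID-extraction idea follows; the function name and the regex are illustrative, not the bot's exact implementation:

```python
from re import search as re_search
from urllib.parse import urlparse, parse_qs

def get_drive_id(link):
    """Return the file/folder ID from a Google Drive URL (illustrative sketch)."""
    if "folders/" in link or "/d/" in link:
        # /drive/folders/<id> and /file/d/<id>/view style links
        match = re_search(r"(?:folders/|/d/)([-\w]+)", link)
        if match is None:
            raise IndexError("Drive ID not found")
        return match.group(1)
    # uc?export=download&id=<id> style links carry the ID as a query parameter
    parsed = urlparse(link)
    return parse_qs(parsed.query)["id"][0]

print(get_drive_id("https://drive.google.com/uc?id=1A2b3C&export=download"))  # → 1A2b3C
```

The same helper handles folder links (`.../drive/folders/<id>`) and file links (`.../file/d/<id>/view`) through the regex branch.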
return parse_qs(parsed.query)['id'][0] - def _on_upload_progress(self): - if self.status is not None: - chunk_size = self.status.total_size * self.status.progress() - self._file_uploaded_bytes - self._file_uploaded_bytes = self.status.total_size * self.status.progress() - self.uploaded_bytes += chunk_size - self.total_time += self.update_interval + def _progress(self): + if self.__status is not None: + chunk_size = self.__status.total_size * self.__status.progress() - self._file_processed_bytes + self._file_processed_bytes = self.__status.total_size * self.__status.progress() + self.processed_bytes += chunk_size + self.__total_time += self.__update_interval def deletefile(self, link: str): try: @@ -189,10 +178,10 @@ class GoogleDriveHelper: body=file_metadata, media_body=media_body) response = None while response is None: - if self.is_cancelled: + if self.__is_cancelled: break try: - self.status, response = drive_file.next_chunk() + self.__status, response = drive_file.next_chunk() except HttpError as err: if err.resp.get('content-type', '').startswith('application/json'): reason = jsnloads(err.content).get('error').get('errors')[0].get('reason') @@ -208,9 +197,14 @@ class GoogleDriveHelper: else: LOGGER.error(f"Got: {reason}") raise err - if self.is_cancelled: + if self.__is_cancelled: return - self._file_uploaded_bytes = 0 + if not self.__listener.seed or self.__listener.newDir: + try: + osremove(file_path) + except: + pass + self._file_processed_bytes = 0 # Insert new permissions if not IS_TEAM_DRIVE: self.__set_permission(response['id']) @@ -220,22 +214,20 @@ class GoogleDriveHelper: return download_url def upload(self, file_name: str): - self.is_downloading = False - self.is_uploading = True - file_dir = f"{DOWNLOAD_DIR}{self.__listener.message.message_id}" - file_path = f"{file_dir}/{file_name}" - size = get_readable_file_size(get_path_size(file_path)) - LOGGER.info("Uploading File: " + file_path) - self.updater = setInterval(self.update_interval, 
self._on_upload_progress) + self.__is_uploading = True + file_path = f"{self.__path}/{file_name}" + size = get_readable_file_size(self.__size) + LOGGER.info(f"Uploading File: {file_path}") + self.__updater = setInterval(self.__update_interval, self._progress) try: if ospath.isfile(file_path): mime_type = get_mime_type(file_path) link = self.__upload_file(file_path, file_name, mime_type, parent_id) - if self.is_cancelled: + if self.__is_cancelled: return if link is None: raise Exception('Upload has been manually cancelled') - LOGGER.info("Uploaded To G-Drive: " + file_path) + LOGGER.info(f"Uploaded To G-Drive: {file_path}") else: mime_type = 'Folder' dir_id = self.__create_directory(ospath.basename(ospath.abspath(file_name)), parent_id) @@ -243,24 +235,24 @@ class GoogleDriveHelper: if result is None: raise Exception('Upload has been manually cancelled!') link = f"https://drive.google.com/folderview?id={dir_id}" - if self.is_cancelled: + if self.__is_cancelled: return - LOGGER.info("Uploaded To G-Drive: " + file_name) + LOGGER.info(f"Uploaded To G-Drive: {file_name}") except Exception as err: if isinstance(err, RetryError): LOGGER.info(f"Total Attempts: {err.last_attempt.attempt_number}") err = err.last_attempt.exception() self.__listener.onUploadError(str(err)) - self.is_errored = True + self.__is_errored = True finally: - self.updater.cancel() - if self.is_cancelled and not self.is_errored: + self.__updater.cancel() + if self.__is_cancelled and not self.__is_errored: if mime_type == 'Folder': LOGGER.info("Deleting uploaded data from Drive...") link = f"https://drive.google.com/folderview?id={dir_id}" self.deletefile(link) return - elif self.is_errored: + elif self.__is_errored: return self.__listener.onUploadComplete(link, size, self.__total_files, self.__total_folders, mime_type, self.name) @@ -284,13 +276,13 @@ class GoogleDriveHelper: if reason in ['userRateLimitExceeded', 'dailyLimitExceeded']: if USE_SERVICE_ACCOUNTS: if self.__sa_count == 
len(listdir("accounts")) or self.__sa_count > 50: - self.is_cancelled = True + self.__is_cancelled = True raise err else: self.__switchServiceAccount() return self.__copyFile(file_id, dest_id) else: - self.is_cancelled = True + self.__is_cancelled = True LOGGER.error(f"Got: {reason}") raise err else: @@ -324,8 +316,8 @@ class GoogleDriveHelper: return files def clone(self, link): - self.is_cloning = True - self.start_time = time() + self.__is_cloning = True + self.__start_time = time() self.__total_files = 0 self.__total_folders = 0 try: @@ -342,7 +334,7 @@ class GoogleDriveHelper: dir_id = self.__create_directory(meta.get('name'), parent_id) self.__cloneFolder(meta.get('name'), meta.get('name'), meta.get('id'), dir_id) durl = self.__G_DRIVE_DIR_BASE_DOWNLOAD_URL.format(dir_id) - if self.is_cancelled: + if self.__is_cancelled: LOGGER.info("Deleting cloned data from Drive...") self.deletefile(durl) return "your clone has been stopped and cloned data has been deleted!", "cancelled" @@ -407,7 +399,7 @@ class GoogleDriveHelper: self.__total_files += 1 self.transferred_size += int(file.get('size', 0)) self.__copyFile(file.get('id'), parent_id) - if self.is_cancelled: + if self.__is_cancelled: break @retry(wait=wait_exponential(multiplier=2, min=3, max=6), stop=stop_after_attempt(3), @@ -445,7 +437,7 @@ class GoogleDriveHelper: self.__upload_file(current_file_name, file_name, mime_type, parent_id) self.__total_files += 1 new_id = parent_id - if self.is_cancelled: + if self.__is_cancelled: break return new_id @@ -467,8 +459,8 @@ class GoogleDriveHelper: def __alt_authorize(self): credentials = None - if USE_SERVICE_ACCOUNTS and not self.alt_auth: - self.alt_auth = True + if USE_SERVICE_ACCOUNTS and not self.__alt_auth: + self.__alt_auth = True if ospath.exists(self.__G_DRIVE_TOKEN_FILE): LOGGER.info("Authorize with token.pickle") with open(self.__G_DRIVE_TOKEN_FILE, 'rb') as f: @@ -479,7 +471,7 @@ class GoogleDriveHelper: def __escapes(self, str): chars = ['\\', "'", '"', 
r'\a', r'\b', r'\f', r'\n', r'\r', r'\t'] for char in chars: - str = str.replace(char, '\\' + char) + str = str.replace(char, f'\\{char}') return str.strip() def __get_recursive_list(self, file, rootid): @@ -522,7 +514,7 @@ class GoogleDriveHelper: if parent_id == "root": return ( self.__service.files() - .list(q=query + " and 'me' in owners", + .list(q=f"{query} and 'me' in owners", pageSize=200, spaces='drive', fields='files(id, name, mimeType, size, parents)', @@ -581,92 +573,78 @@ class GoogleDriveHelper: msg = "" fileName = self.__escapes(str(fileName)) contents_count = 0 - telegraph_content = [] - path = [] Title = False if len(DRIVES_IDS) > 1: token_service = self.__alt_authorize() if token_service is not None: self.__service = token_service for index, parent_id in enumerate(DRIVES_IDS): - if isRecursive and len(parent_id) > 23: - isRecur = False - else: - isRecur = isRecursive + isRecur = False if isRecursive and len(parent_id) > 23 else isRecursive response = self.__drive_query(parent_id, fileName, stopDup, isRecur, itemType) - if not response["files"] and noMulti: - break - elif not response["files"]: - continue + if not response["files"]: + if noMulti: + break + else: + continue if not Title: - msg += f'

Search Result For {fileName}

' + msg += '' \ + f'

Search Result For {fileName}

' Title = True if len(DRIVES_NAMES) > 1 and DRIVES_NAMES[index] is not None: - msg += f"╾────────────╼
{DRIVES_NAMES[index]}
╾────────────╼
" + msg += '' \ + f'{DRIVES_NAMES[index]}' for file in response.get('files', []): mime_type = file.get('mimeType') if mime_type == "application/vnd.google-apps.folder": furl = f"https://drive.google.com/drive/folders/{file.get('id')}" - msg += f"📁 {file.get('name')}
(folder)

" - msg += f"Drive Link" + msg += '' \ + f"
📁 {file.get('name')} (folder)
" \ + '