Mirror of https://github.com/anasty17/mirror-leech-telegram-bot.git (synced 2025-01-08 12:07:33 +08:00)
Seed after leech and extract
- Send html file instead of telegraph, no limits no flooding (by @junedkh)
- Seed using aria2c
- Remove QB_SEED var
- Add ability to specify ratio and seed-time for each torrent
- Watch commands changed
- Added alt cmd for all mirror, leech and rss commands
- Add seed speed to overall upload speed
- Mirror TG photos
- Bugs fixed

Signed-off-by: anasty17 <e.anastayyar@gmail.com>
This commit is contained in:
parent 64a840bd08
commit 1cc1c5b524
README.md (26 lines changed)
@@ -7,7 +7,7 @@ This is a Telegram Bot written in Python for mirroring files on the Internet to
 - Select files from Torrent before downloading using qbittorrent and aria2c.
 - Leech (splitting, thumbnail for each user, setting as document or as media for each user).
 - Stop duplicates for all tasks except yt-dlp tasks.
-- Zip/Unzip G-Drive links.
+- Leech/Zip/Unzip G-Drive links.
 - Counting files/folders from Google Drive link.
 - View Link button, an extra button to open the file index link in the browser instead of direct download.
 - Status Pages for unlimited tasks.
@@ -24,7 +24,7 @@ This is a Telegram Bot written in Python for mirroring files on the Internet to
 - Search on torrents with Torrent Search API or with variable plugins using qBittorrent search engine
 - Docker image support for linux `amd64, arm64/v8, arm/v7, s390x`.
 - Update bot at startup and with restart command using `UPSTREAM_REPO`.
-- Qbittorrent seed until reaching specific ratio or time.
+- Bittorrent seed until reaching specific ratio or time.
 - Rss feed and filter. Based on this repository [rss-chan](https://github.com/hyPnOtICDo0g/rss-chan).
 - Save leech settings including thumbnails in database.
 - Mirror/Leech/Clone multi links/files with one command.
@@ -34,7 +34,7 @@ This is a Telegram Bot written in Python for mirroring files on the Internet to
 - Custom Name for all links except torrents. For files you should add extension except yt-dlp links.
 - Many bugs have been fixed.

-## From Other Repositories
+## From Base and other Repositories
 - Mirror direct download links, Torrent, and Telegram files to Google Drive
 - Mirror Mega.nz links to Google Drive
 - Copy files from someone's Drive to your Drive (Using Autorclone)
@@ -49,11 +49,11 @@ This is a Telegram Bot written in Python for mirroring files on the Internet to
 - Shell and Executor
 - Add sudo users
 - Extract password protected files
-- Extract these filetypes and uploads to Google Drive
+- Extract these filetypes
 > ZIP, RAR, TAR, 7z, ISO, WIM, CAB, GZIP, BZIP2, APM, ARJ, CHM, CPIO, CramFS, DEB, DMG, FAT, HFS, LZH, LZMA, LZMA2, MBR, MSI, MSLZ, NSIS, NTFS, RPM, SquashFS, UDF, VHD, XAR, Z, TAR.XZ

 - Direct links Supported:
->mediafire, letsupload.io, hxfile.co, anonfiles.com, bayfiles.com, antfiles, fembed.com, fembed.net, femax20.com, layarkacaxxi.icu, fcdn.stream, sbplay.org, naniplay.com, naniplay.nanime.in, naniplay.nanime.biz, sbembed.com, streamtape.com, streamsb.net, feurl.com, upload.ee, pixeldrain.com, racaty.net, 1fichier.com, 1drv.ms (Only works for file not folder or business account), uptobox.com (Uptobox account must be premium) and solidfiles.com
+>mediafire, letsupload.io, hxfile.co, anonfiles.com, bayfiles.com, antfiles, fembed.com, fembed.net, femax20.com, layarkacaxxi.icu, fcdn.stream, sbplay.org, naniplay.com, naniplay.nanime.in, naniplay.nanime.biz, sbembed.com, streamtape.com, streamsb.net, feurl.com, upload.ee, pixeldrain.com, racaty.net, 1fichier.com, 1drv.ms (Only works for file not folder or business account), uptobox.com and solidfiles.com

 # How to deploy?
@@ -140,7 +140,6 @@ Fill up rest of the fields. Meaning of each field is discussed below:
 - `BASE_URL_OF_BOT`: Valid BASE URL where the bot is deployed, needed for qbittorrent web selection. Format of URL should be `http://myip`, where `myip` is the public IP/Domain of your bot. If you have chosen a port other than `80`, write it in this format: `http://myip:port` (`http`, not `https`). This var is optional on VPS and required for Heroku, especially to avoid app sleeping/idling. For Heroku fill `https://yourappname.herokuapp.com`. Still got idling? You can use http://cron-job.org to ping your Heroku app. `Str`
 - `SERVER_PORT`: Only for VPS, which is the **BASE_URL_OF_BOT** port. `Str`
 - `WEB_PINCODE`: If empty or `False`, no pincode is required for qbit web selection. `Bool`
-- `QB_SEED`: QB torrent will be seeded after and while uploading until reaching a specific ratio or time; edit `MaxRatio` or `GlobalMaxSeedingMinutes` or both in qbittorrent.conf (`-1` means no limit, but you can cancel manually by gid). **NOTE**: 1. Don't change `MaxRatioAction`, 2. Only works with `/qbmirror` and `/qbzipmirror`. You can also use this feature for a specific torrent while using the bot and leave this variable empty. Default is `False`. `Bool`
 - **Qbittorrent NOTE**: If you're facing a RAM exceeded issue, then set a limit for `MaxConnecs`, decrease `AsyncIOThreadsCount` in the qbittorrent config and set a limit of `32` for `DiskWriteCacheSize`.

 ### RSS
@@ -332,7 +331,22 @@ help - All cmds with description

 ------

+## Bittorrent Seed
+
+- Add the `d:ratio:time` prefix along with a leech or mirror cmd.
+- Using the `d` prefix alone will use the global options of aria2c or qbittorrent.
+
+### Qbittorrent
+- Global options: `MaxRatio` and `GlobalMaxSeedingMinutes` in qbittorrent.conf, `-1` means no limit, but you can cancel manually.
+- **NOTE**: Don't change `MaxRatioAction`.
+
+### Aria2c
+- Global options: `--seed-ratio` (0 means no limit) and `--seed-time` (0 means no seed) in aria.sh.
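Editor's example (not part of the commit; values are hypothetical, and the time component is assumed to be in minutes, matching aria2c's `--seed-time`): `/mirror d:1.0:60 <magnet_link>` would seed until ratio 1.0 or 60 minutes, while `/qbleech d <torrent_file>` would seed with the global qBittorrent limits.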
+
+------

 ## Using Service Accounts for uploading to avoid user rate limit

 >For Service Account to work, you must set `USE_SERVICE_ACCOUNTS` = "True" in config file or environment variables.
 >**NOTE**: Using Service Accounts is only recommended while uploading to a Team Drive.
aria.sh (17 lines changed)
@@ -3,12 +3,11 @@ then
   TORRENT_TIMEOUT=0
 fi
 tracker_list=$(curl -Ns https://raw.githubusercontent.com/XIU2/TrackersListCollection/master/all.txt https://ngosang.github.io/trackerslist/trackers_all_http.txt https://newtrackon.com/api/all https://raw.githubusercontent.com/hezhijie0327/Trackerslist/main/trackerslist_tracker.txt | awk '$0' | tr '\n\n' ',')
-aria2c --enable-rpc=true --check-certificate=false --daemon=true \
-  --max-connection-per-server=10 --rpc-max-request-size=1024M --quiet=true \
-  --bt-stop-timeout=$TORRENT_TIMEOUT --min-split-size=10M --split=10 --allow-overwrite=true \
-  --max-overall-download-limit=0 --bt-tracker="[$tracker_list]" --disk-cache=32M \
-  --max-overall-upload-limit=1K --max-concurrent-downloads=15 --summary-interval=0 \
-  --peer-id-prefix=-qB4430- --user-agent=Wget/1.12 --peer-agent=qBittorrent/4.4.3 \
-  --bt-enable-lpd=true --seed-time=0 --max-file-not-found=0 --max-tries=20 --follow-torrent=mem \
-  --auto-file-renaming=true --reuse-uri=true --http-accept-gzip=true --continue=true \
-  --content-disposition-default-utf8=true --netrc-path=/usr/src/app/.netrc --bt-remove-unselected-file=true
+aria2c --allow-overwrite=true --auto-file-renaming=true --bt-enable-lpd=true \
+  --bt-remove-unselected-file=true --bt-stop-timeout=$TORRENT_TIMEOUT --bt-tracker="[$tracker_list]" \
+  --check-certificate=false --continue=true --content-disposition-default-utf8=true --daemon=true \
+  --disk-cache=40M --enable-rpc=true --follow-torrent=mem --force-save=true --http-accept-gzip=true \
+  --max-connection-per-server=10 --max-concurrent-downloads=20 --max-file-not-found=0 --max-tries=20 \
+  --min-split-size=10M --netrc-path=/usr/src/app/.netrc --optimize-concurrent-downloads=true \
+  --peer-id-prefix=-qB4430- --peer-agent=qBittorrent/4.4.3 --quiet=true --reuse-uri=true \
+  --rpc-max-request-size=1024M --seed-ratio=0 --split=10 --summary-interval=0 --user-agent=Wget/1.12
bot/__init__.py

@@ -107,7 +107,7 @@ AUTHORIZED_CHATS = set()
 SUDO_USERS = set()
 AS_DOC_USERS = set()
 AS_MEDIA_USERS = set()
-EXTENSION_FILTER = set()
+EXTENSION_FILTER = set(['.aria2'])

 try:
     aid = getConfig('AUTHORIZED_CHATS')

@@ -151,19 +151,19 @@ try:
     USER_SESSION_STRING = getConfig('USER_SESSION_STRING')
     if len(USER_SESSION_STRING) == 0:
         raise KeyError
-    LOGGER.info("Creating client from USER_SESSION_STRING")
+    log_info("Creating client from USER_SESSION_STRING")
     app = Client(name='pyrogram', api_id=int(TELEGRAM_API), api_hash=TELEGRAM_HASH, session_string=USER_SESSION_STRING, parse_mode=enums.ParseMode.HTML, no_updates=True)
     with app:
         IS_PREMIUM_USER = app.me.is_premium
 except:
-    LOGGER.info("Creating client from BOT_TOKEN")
+    log_info("Creating client from BOT_TOKEN")
     app = Client(name='pyrogram', api_id=int(TELEGRAM_API), api_hash=TELEGRAM_HASH, bot_token=BOT_TOKEN, parse_mode=enums.ParseMode.HTML, no_updates=True)

 try:
     RSS_USER_SESSION_STRING = getConfig('RSS_USER_SESSION_STRING')
     if len(RSS_USER_SESSION_STRING) == 0:
         raise KeyError
-    LOGGER.info("Creating client from RSS_USER_SESSION_STRING")
+    log_info("Creating client from RSS_USER_SESSION_STRING")
     rss_session = Client(name='rss_session', api_id=int(TELEGRAM_API), api_hash=TELEGRAM_HASH, session_string=RSS_USER_SESSION_STRING, parse_mode=enums.ParseMode.HTML, no_updates=True)
 except:
     rss_session = None

@@ -212,10 +212,8 @@ try:
         raise KeyError
     LEECH_SPLIT_SIZE = int(LEECH_SPLIT_SIZE)
 except:
-    if not IS_PREMIUM_USER:
-        LEECH_SPLIT_SIZE = 2097152000
-    else:
-        LEECH_SPLIT_SIZE = 4194304000
+    LEECH_SPLIT_SIZE = 4194304000 if IS_PREMIUM_USER else 2097152000
+MAX_SPLIT_SIZE = 4194304000 if IS_PREMIUM_USER else 2097152000
 try:
     STATUS_LIMIT = getConfig('STATUS_LIMIT')
     if len(STATUS_LIMIT) == 0:

@@ -335,11 +333,6 @@ try:
     EQUAL_SPLITS = EQUAL_SPLITS.lower() == 'true'
 except:
     EQUAL_SPLITS = False
-try:
-    QB_SEED = getConfig('QB_SEED')
-    QB_SEED = QB_SEED.lower() == 'true'
-except:
-    QB_SEED = False
 try:
     CUSTOM_FILENAME = getConfig('CUSTOM_FILENAME')
     if len(CUSTOM_FILENAME) == 0:
bot/__main__.py (187 lines changed)
@@ -9,16 +9,13 @@ from telegram.ext import CommandHandler

 from bot import bot, dispatcher, updater, botStartTime, IGNORE_PENDING_REQUESTS, LOGGER, Interval, INCOMPLETE_TASK_NOTIFIER, DB_URI, app, main_loop
 from .helper.ext_utils.fs_utils import start_cleanup, clean_all, exit_clean_up
-from .helper.ext_utils.telegraph_helper import telegraph
 from .helper.ext_utils.bot_utils import get_readable_file_size, get_readable_time
 from .helper.ext_utils.db_handler import DbManger
 from .helper.telegram_helper.bot_commands import BotCommands
 from .helper.telegram_helper.message_utils import sendMessage, sendMarkup, editMessage, sendLogFile
 from .helper.telegram_helper.filters import CustomFilters
 from .helper.telegram_helper.button_build import ButtonMaker

-from .modules import authorize, list, cancel_mirror, mirror_status, mirror, clone, watch, shell, eval, delete, count, leech_settings, search, rss, bt_select
+from .modules import authorize, list, cancel_mirror, mirror_status, mirror_leech, clone, ytdlp, shell, eval, delete, count, leech_settings, search, rss, bt_select


 def stats(update, context):
@@ -102,148 +99,60 @@ def ping(update, context):
 def log(update, context):
     sendLogFile(context.bot, update.message)


-help_string_telegraph = f'''<br>
-<b>/{BotCommands.HelpCommand}</b>: To get this message
-<br><br>
-<b>/{BotCommands.MirrorCommand}</b> [download_url][magnet_link]: Start mirroring to Google Drive. Send <b>/{BotCommands.MirrorCommand}</b> for more help
-<br><br>
-<b>/{BotCommands.ZipMirrorCommand}</b> [download_url][magnet_link]: Start mirroring and upload the file/folder compressed with zip extension
-<br><br>
-<b>/{BotCommands.UnzipMirrorCommand}</b> [download_url][magnet_link]: Start mirroring and upload the file/folder extracted from any archive extension
-<br><br>
-<b>/{BotCommands.QbMirrorCommand}</b> [magnet_link][torrent_file][torrent_file_url]: Start Mirroring using qBittorrent. Send <b>/{BotCommands.QbMirrorCommand}</b> for more help
-<br><br>
-<b>/{BotCommands.QbZipMirrorCommand}</b> [magnet_link][torrent_file][torrent_file_url]: Start mirroring using qBittorrent and upload the file/folder compressed with zip extension
-<br><br>
-<b>/{BotCommands.QbUnzipMirrorCommand}</b> [magnet_link][torrent_file][torrent_file_url]: Start mirroring using qBittorrent and upload the file/folder extracted from any archive extension
-<br><br>
-<b>/{BotCommands.LeechCommand}</b> [download_url][magnet_link]: Start leeching to Telegram, Use <b>/{BotCommands.LeechCommand} s</b> to select files before leeching
-<br><br>
-<b>/{BotCommands.ZipLeechCommand}</b> [download_url][magnet_link]: Start leeching to Telegram and upload the file/folder compressed with zip extension
-<br><br>
-<b>/{BotCommands.UnzipLeechCommand}</b> [download_url][magnet_link][torent_file]: Start leeching to Telegram and upload the file/folder extracted from any archive extension
-<br><br>
-<b>/{BotCommands.QbLeechCommand}</b> [magnet_link][torrent_file][torrent_file_url]: Start leeching to Telegram using qBittorrent, Use <b>/{BotCommands.QbLeechCommand} s</b> to select files before leeching
-<br><br>
-<b>/{BotCommands.QbZipLeechCommand}</b> [magnet_link][torrent_file][torrent_file_url]: Start leeching to Telegram using qBittorrent and upload the file/folder compressed with zip extension
-<br><br>
-<b>/{BotCommands.QbUnzipLeechCommand}</b> [magnet_link][torrent_file][torrent_file_url]: Start leeching to Telegram using qBittorrent and upload the file/folder extracted from any archive extension
-<br><br>
-<b>/{BotCommands.CloneCommand}</b> [drive_url][gdtot_url]: Copy file/folder to Google Drive
-<br><br>
-<b>/{BotCommands.CountCommand}</b> [drive_url][gdtot_url]: Count file/folder of Google Drive
-<br><br>
-<b>/{BotCommands.DeleteCommand}</b> [drive_url]: Delete file/folder from Google Drive (Only Owner & Sudo)
-<br><br>
-<b>/{BotCommands.WatchCommand}</b> [yt-dlp supported link]: Mirror yt-dlp supported link. Send <b>/{BotCommands.WatchCommand}</b> for more help
-<br><br>
-<b>/{BotCommands.ZipWatchCommand}</b> [yt-dlp supported link]: Mirror yt-dlp supported link as zip
-<br><br>
-<b>/{BotCommands.LeechWatchCommand}</b> [yt-dlp supported link]: Leech yt-dlp supported link
-<br><br>
-<b>/{BotCommands.LeechZipWatchCommand}</b> [yt-dlp supported link]: Leech yt-dlp supported link as zip
-<br><br>
-<b>/{BotCommands.LeechSetCommand}</b>: Leech settings
-<br><br>
-<b>/{BotCommands.SetThumbCommand}</b>: Reply photo to set it as Thumbnail
-<br><br>
-<b>/{BotCommands.BtSelectCommand}</b>: Reply to an active /cmd which was used to start the bt-download or add gid along with cmd. This command mainly for selection incase you decided to select files from already added torrent. But you can always use /cmd with arg `s` to select files before download start
-<br><br>
-<b>/{BotCommands.RssListCommand}</b>: List all subscribed rss feed info
-<br><br>
-<b>/{BotCommands.RssGetCommand}</b>: [Title] [Number](last N links): Force fetch last N links
-<br><br>
-<b>/{BotCommands.RssSubCommand}</b>: [Title] [Rss Link] f: [filter]: Subscribe new rss feed
-<br><br>
-<b>/{BotCommands.RssUnSubCommand}</b>: [Title]: Unubscribe rss feed by title
-<br><br>
-<b>/{BotCommands.RssSettingsCommand}</b>: Rss Settings
-<br><br>
-<b>/{BotCommands.CancelMirror}</b>: Reply to the message by which the download was initiated and that download will be cancelled
-<br><br>
-<b>/{BotCommands.CancelAllCommand}</b>: Cancel all downloading tasks
-<br><br>
-<b>/{BotCommands.ListCommand}</b> [query]: Search in Google Drive(s)
-<br><br>
-<b>/{BotCommands.SearchCommand}</b> [query]: Search for torrents with API
-<br>sites: <code>rarbg, 1337x, yts, etzv, tgx, torlock, piratebay, nyaasi, ettv</code><br><br>
-<b>/{BotCommands.StatusCommand}</b>: Shows a status of all the downloads
-<br><br>
-<b>/{BotCommands.StatsCommand}</b>: Show Stats of the machine the bot is hosted on
-'''
-
-help = telegraph.create_page(
-    title='Mirror-Leech-Bot Help',
-    content=help_string_telegraph,
-)["path"]
-
 help_string = f'''
-/{BotCommands.PingCommand}: Check how long it takes to Ping the Bot
-
-/{BotCommands.AuthorizeCommand}: Authorize a chat or a user to use the bot (Can only be invoked by Owner & Sudo of the bot)
-
-/{BotCommands.UnAuthorizeCommand}: Unauthorize a chat or a user to use the bot (Can only be invoked by Owner & Sudo of the bot)
-
-/{BotCommands.AuthorizedUsersCommand}: Show authorized users (Only Owner & Sudo)
-
-/{BotCommands.AddSudoCommand}: Add sudo user (Only Owner)
-
-/{BotCommands.RmSudoCommand}: Remove sudo users (Only Owner)
-
-/{BotCommands.RestartCommand}: Restart and update the bot
-
-/{BotCommands.LogCommand}: Get a log file of the bot. Handy for getting crash reports
-
-/{BotCommands.ShellCommand}: Run commands in Shell (Only Owner)
-
-/{BotCommands.ExecHelpCommand}: Get help for Executor module (Only Owner)
+NOTE: Try each command without any prefix to see more details.
+/{BotCommands.MirrorCommand[0]} or /{BotCommands.MirrorCommand[1]}: Start mirroring to Google Drive.
+/{BotCommands.ZipMirrorCommand[0]} or /{BotCommands.ZipMirrorCommand[1]}: Start mirroring and upload the file/folder compressed with zip extension.
+/{BotCommands.UnzipMirrorCommand[0]} or /{BotCommands.UnzipMirrorCommand[1]}: Start mirroring and upload the file/folder extracted from any archive extension.
+/{BotCommands.QbMirrorCommand[0]} or /{BotCommands.QbMirrorCommand[1]}: Start Mirroring to Google Drive using qBittorrent.
+/{BotCommands.QbZipMirrorCommand[0]} or /{BotCommands.QbZipMirrorCommand[1]}: Start mirroring using qBittorrent and upload the file/folder compressed with zip extension.
+/{BotCommands.QbUnzipMirrorCommand[0]} or /{BotCommands.QbUnzipMirrorCommand[1]}: Start mirroring using qBittorrent and upload the file/folder extracted from any archive extension.
+/{BotCommands.YtdlCommand[0]} or /{BotCommands.YtdlCommand[1]}: Mirror yt-dlp supported link.
+/{BotCommands.YtdlZipCommand[0]} or /{BotCommands.YtdlZipCommand[1]}: Mirror yt-dlp supported link as zip.
+/{BotCommands.LeechCommand[0]} or /{BotCommands.LeechCommand[1]}: Start leeching to Telegram.
+/{BotCommands.ZipLeechCommand[0]} or /{BotCommands.ZipLeechCommand[1]}: Start leeching and upload the file/folder compressed with zip extension.
+/{BotCommands.UnzipLeechCommand[0]} or /{BotCommands.UnzipLeechCommand[1]}: Start leeching and upload the file/folder extracted from any archive extension.
+/{BotCommands.QbLeechCommand[0]} or /{BotCommands.QbLeechCommand[1]}: Start leeching using qBittorrent.
+/{BotCommands.QbZipLeechCommand[0]} or /{BotCommands.QbZipLeechCommand[1]}: Start leeching using qBittorrent and upload the file/folder compressed with zip extension.
+/{BotCommands.QbUnzipLeechCommand[0]} or /{BotCommands.QbUnzipLeechCommand[1]}: Start leeching using qBittorrent and upload the file/folder extracted from any archive extension.
+/{BotCommands.YtdlLeechCommand[0]} or /{BotCommands.YtdlLeechCommand[1]}: Leech yt-dlp supported link.
+/{BotCommands.YtdlZipLeechCommand[0]} or /{BotCommands.YtdlZipLeechCommand[1]}: Leech yt-dlp supported link as zip.
+/{BotCommands.CloneCommand} [drive_url]: Copy file/folder to Google Drive.
+/{BotCommands.CountCommand} [drive_url]: Count file/folder of Google Drive.
+/{BotCommands.DeleteCommand} [drive_url]: Delete file/folder from Google Drive (Only Owner & Sudo).
+/{BotCommands.LeechSetCommand} [query]: Leech settings.
+/{BotCommands.SetThumbCommand}: Reply photo to set it as Thumbnail.
+/{BotCommands.BtSelectCommand}: Select files from torrents by gid or reply.
+/{BotCommands.RssListCommand[0]} or /{BotCommands.RssListCommand[1]}: List all subscribed rss feed info (Only Owner & Sudo).
+/{BotCommands.RssGetCommand[0]} or /{BotCommands.RssGetCommand[1]}: Force fetch last N links (Only Owner & Sudo).
+/{BotCommands.RssSubCommand[0]} or /{BotCommands.RssSubCommand[1]}: Subscribe new rss feed (Only Owner & Sudo).
+/{BotCommands.RssUnSubCommand[0]} or /{BotCommands.RssUnSubCommand[1]}: Unsubscribe rss feed by title (Only Owner & Sudo).
+/{BotCommands.RssSettingsCommand[0]} or /{BotCommands.RssSettingsCommand[1]} [query]: Rss Settings (Only Owner & Sudo).
+/{BotCommands.CancelMirror}: Cancel task by gid or reply.
+/{BotCommands.CancelAllCommand} [query]: Cancel all [status] tasks.
+/{BotCommands.ListCommand} [query]: Search in Google Drive(s).
+/{BotCommands.SearchCommand} [query]: Search for torrents with API.
+/{BotCommands.StatusCommand}: Shows a status of all the downloads.
+/{BotCommands.StatsCommand}: Show stats of the machine where the bot is hosted on.
+/{BotCommands.PingCommand}: Check how long it takes to Ping the Bot (Only Owner & Sudo).
+/{BotCommands.AuthorizeCommand}: Authorize a chat or a user to use the bot (Only Owner & Sudo).
+/{BotCommands.UnAuthorizeCommand}: Unauthorize a chat or a user to use the bot (Only Owner & Sudo).
+/{BotCommands.AuthorizedUsersCommand}: Show authorized users (Only Owner & Sudo).
+/{BotCommands.AddSudoCommand}: Add sudo user (Only Owner).
+/{BotCommands.RmSudoCommand}: Remove sudo users (Only Owner).
+/{BotCommands.RestartCommand}: Restart and update the bot (Only Owner & Sudo).
+/{BotCommands.LogCommand}: Get a log file of the bot. Handy for getting crash reports (Only Owner & Sudo).
+/{BotCommands.ShellCommand}: Run shell commands (Only Owner).
+/{BotCommands.EvalCommand}: Run Python Code Line | Lines (Only Owner).
+/{BotCommands.ExecCommand}: Run Commands In Exec (Only Owner).
+/{BotCommands.ClearLocalsCommand}: Clear {BotCommands.EvalCommand} or {BotCommands.ExecCommand} locals (Only Owner).
 '''

 def bot_help(update, context):
-    button = ButtonMaker()
-    button.buildbutton("Other Commands", f"https://telegra.ph/{help}")
-    reply_markup = InlineKeyboardMarkup(button.build_menu(1))
-    sendMarkup(help_string, context.bot, update.message, reply_markup)
-
-botcmds = [
-    (f'{BotCommands.MirrorCommand}', 'Mirror'),
-    (f'{BotCommands.ZipMirrorCommand}', 'Mirror and upload as zip'),
-    (f'{BotCommands.UnzipMirrorCommand}', 'Mirror and extract files'),
-    (f'{BotCommands.QbMirrorCommand}', 'Mirror torrent using qBittorrent'),
-    (f'{BotCommands.QbZipMirrorCommand}', 'Mirror torrent and upload as zip using qb'),
-    (f'{BotCommands.QbUnzipMirrorCommand}', 'Mirror torrent and extract files using qb'),
-    (f'{BotCommands.WatchCommand}', 'Mirror yt-dlp supported link'),
-    (f'{BotCommands.ZipWatchCommand}', 'Mirror yt-dlp supported link as zip'),
-    (f'{BotCommands.CloneCommand}', 'Copy file/folder to Drive'),
-    (f'{BotCommands.LeechCommand}', 'Leech'),
-    (f'{BotCommands.ZipLeechCommand}', 'Leech and upload as zip'),
-    (f'{BotCommands.UnzipLeechCommand}', 'Leech and extract files'),
-    (f'{BotCommands.QbLeechCommand}', 'Leech torrent using qBittorrent'),
-    (f'{BotCommands.QbZipLeechCommand}', 'Leech torrent and upload as zip using qb'),
-    (f'{BotCommands.QbUnzipLeechCommand}', 'Leech torrent and extract using qb'),
-    (f'{BotCommands.LeechWatchCommand}', 'Leech yt-dlp supported link'),
-    (f'{BotCommands.LeechZipWatchCommand}', 'Leech yt-dlp supported link as zip'),
-    (f'{BotCommands.CountCommand}', 'Count file/folder of Drive'),
-    (f'{BotCommands.DeleteCommand}', 'Delete file/folder from Drive'),
-    (f'{BotCommands.CancelMirror}', 'Cancel a task'),
-    (f'{BotCommands.CancelAllCommand}', 'Cancel all downloading tasks'),
-    (f'{BotCommands.ListCommand}', 'Search in Drive'),
-    (f'{BotCommands.LeechSetCommand}', 'Leech settings'),
-    (f'{BotCommands.SetThumbCommand}', 'Set thumbnail'),
-    (f'{BotCommands.StatusCommand}', 'Get mirror status message'),
-    (f'{BotCommands.StatsCommand}', 'Bot usage stats'),
-    (f'{BotCommands.PingCommand}', 'Ping the bot'),
-    (f'{BotCommands.RestartCommand}', 'Restart the bot'),
-    (f'{BotCommands.LogCommand}', 'Get the bot Log'),
-    (f'{BotCommands.HelpCommand}', 'Get detailed help')
-]
+    sendMessage(help_string, context.bot, update.message)


 def main():
     # bot.set_my_commands(botcmds)
     start_cleanup()
     notifier_dict = False
     if INCOMPLETE_TASK_NOTIFIER and DB_URI is not None:
         if notifier_dict := DbManger().get_incomplete_tasks():
             for cid, data in notifier_dict.items():
bot/helper/ext_utils/bot_utils.py

@@ -127,35 +127,19 @@ def get_readable_message():
             msg += f"\n<b>Status:</b> <i>{download.status()}</i>"
             if download.status() not in [MirrorStatus.STATUS_SPLITTING, MirrorStatus.STATUS_SEEDING]:
                 msg += f"\n{get_progress_bar_string(download)} {download.progress()}"
-                if download.status() in [MirrorStatus.STATUS_DOWNLOADING,
-                                         MirrorStatus.STATUS_WAITING,
-                                         MirrorStatus.STATUS_PAUSED]:
-                    msg += f"\n<b>Downloaded:</b> {get_readable_file_size(download.processed_bytes())} of {download.size()}"
-                elif download.status() == MirrorStatus.STATUS_UPLOADING:
-                    msg += f"\n<b>Uploaded:</b> {get_readable_file_size(download.processed_bytes())} of {download.size()}"
-                elif download.status() == MirrorStatus.STATUS_CLONING:
-                    msg += f"\n<b>Cloned:</b> {get_readable_file_size(download.processed_bytes())} of {download.size()}"
-                elif download.status() == MirrorStatus.STATUS_ARCHIVING:
-                    msg += f"\n<b>Archived:</b> {get_readable_file_size(download.processed_bytes())} of {download.size()}"
-                elif download.status() == MirrorStatus.STATUS_EXTRACTING:
-                    msg += f"\n<b>Extracted:</b> {get_readable_file_size(download.processed_bytes())} of {download.size()}"
+                msg += f"\n<b>Processed:</b> {get_readable_file_size(download.processed_bytes())} of {download.size()}"
                 msg += f"\n<b>Speed:</b> {download.speed()} | <b>ETA:</b> {download.eta()}"
-                try:
-                    msg += f"\n<b>Seeders:</b> {download.aria_download().num_seeders}" \
-                           f" | <b>Peers:</b> {download.aria_download().connections}"
-                except:
-                    pass
-                try:
-                    msg += f"\n<b>Seeders:</b> {download.torrent_info().num_seeds}" \
-                           f" | <b>Leechers:</b> {download.torrent_info().num_leechs}"
-                except:
-                    pass
+                if hasattr(download, 'seeders_num'):
+                    try:
+                        msg += f"\n<b>Seeders:</b> {download.seeders_num()} | <b>Leechers:</b> {download.leechers_num()}"
+                    except:
+                        pass
             elif download.status() == MirrorStatus.STATUS_SEEDING:
                 msg += f"\n<b>Size: </b>{download.size()}"
-                msg += f"\n<b>Speed: </b>{get_readable_file_size(download.torrent_info().upspeed)}/s"
-                msg += f" | <b>Uploaded: </b>{get_readable_file_size(download.torrent_info().uploaded)}"
-                msg += f"\n<b>Ratio: </b>{round(download.torrent_info().ratio, 3)}"
-                msg += f" | <b>Time: </b>{get_readable_time(download.torrent_info().seeding_time)}"
+                msg += f"\n<b>Speed: </b>{download.upload_speed()}"
+                msg += f" | <b>Uploaded: </b>{download.uploaded_bytes()}"
+                msg += f"\n<b>Ratio: </b>{download.ratio()}"
+                msg += f" | <b>Time: </b>{download.seeding_time()}"
             else:
                 msg += f"\n<b>Size: </b>{download.size()}"
             msg += f"\n<code>/{BotCommands.CancelMirror} {download.gid()}</code>"

@@ -180,6 +164,12 @@ def get_readable_message():
                     upspeed_bytes += float(spd.split('K')[0]) * 1024
                 elif 'MB/s' in spd:
                     upspeed_bytes += float(spd.split('M')[0]) * 1048576
+            elif download.status() == MirrorStatus.STATUS_SEEDING:
+                spd = download.upload_speed()
+                if 'K' in spd:
+                    upspeed_bytes += float(spd.split('K')[0]) * 1024
+                elif 'M' in spd:
+                    upspeed_bytes += float(spd.split('M')[0]) * 1048576
         bmsg += f"\n<b>DL:</b> {get_readable_file_size(dlspeed_bytes)}/s | <b>UL:</b> {get_readable_file_size(upspeed_bytes)}/s"
         if STATUS_LIMIT is not None and tasks > STATUS_LIMIT:
             msg += f"<b>Page:</b> {PAGE_NO}/{pages} | <b>Tasks:</b> {tasks}\n"
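The overall UL/DL totals above are summed by parsing the human-readable speed strings back into bytes. A minimal standalone sketch of that conversion (not part of the commit; same logic as the added SEEDING branch):

```python
def speed_to_bytes(spd: str) -> float:
    # e.g. "512 KB/s" -> 524288.0, "1.5 MB/s" -> 1572864.0
    if 'K' in spd:
        return float(spd.split('K')[0]) * 1024
    if 'M' in spd:
        return float(spd.split('M')[0]) * 1048576
    return 0.0  # slower speeds are ignored in the tally, as in the bot


print(speed_to_bytes('1.5 MB/s'))  # 1572864.0
```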
bot/helper/ext_utils/fs_utils.py

@@ -10,12 +10,7 @@ from math import ceil
 from re import split as re_split, I

 from .exceptions import NotSupportedExtractionArchive
-from bot import aria2, app, LOGGER, DOWNLOAD_DIR, get_client, LEECH_SPLIT_SIZE, EQUAL_SPLITS, IS_PREMIUM_USER
-
-if IS_PREMIUM_USER:
-    MAX_SPLIT_SIZE = 4194304000
-else:
-    MAX_SPLIT_SIZE = 2097152000
+from bot import aria2, app, LOGGER, DOWNLOAD_DIR, get_client, LEECH_SPLIT_SIZE, EQUAL_SPLITS, IS_PREMIUM_USER, MAX_SPLIT_SIZE

 VIDEO_SUFFIXES = ("M4V", "MP4", "MOV", "FLV", "WMV", "3GP", "MPG", "WEBM", "MKV", "AVI")

@@ -24,6 +19,19 @@ ARCH_EXT = [".tar.bz2", ".tar.gz", ".bz2", ".gz", ".tar.xz", ".tar", ".tbz2", ".
             ".cpio", ".cramfs", ".deb", ".dmg", ".fat", ".hfs", ".lzh", ".lzma", ".mbr",
             ".msi", ".mslz", ".nsis", ".ntfs", ".rpm", ".squashfs", ".udf", ".vhd", ".xar"]

+def clean_target(path: str):
+    if ospath.exists(path):
+        LOGGER.info(f"Cleaning Target: {path}")
+        if ospath.isdir(path):
+            try:
+                rmtree(path)
+            except:
+                pass
+        elif ospath.isfile(path):
+            try:
+                osremove(path)
+            except:
+                pass
+
 def clean_download(path: str):
     if ospath.exists(path):

@@ -62,11 +70,10 @@ def clean_unwanted(path: str):
     LOGGER.info(f"Cleaning unwanted files/folders: {path}")
     for dirpath, subdir, files in walk(path, topdown=False):
         for filee in files:
-            if filee.endswith((".!qB", ".aria2")) or filee.endswith('.parts') and filee.startswith('.'):
+            if filee.endswith(".!qB") or filee.endswith('.parts') and filee.startswith('.'):
                 osremove(ospath.join(dirpath, filee))
         for folder in subdir:
             if folder == ".unwanted":
                 rmtree(ospath.join(dirpath, folder))
         if dirpath.endswith((".unwanted", "splited_files_mltb")):
             rmtree(dirpath)
     for dirpath, subdir, files in walk(path, topdown=False):
         if not listdir(dirpath):
             rmdir(dirpath)

@@ -117,6 +124,9 @@ def take_ss(video_file):
     return des_dir

 def split_file(path, size, file_, dirpath, split_size, listener, start_time=0, i=1, inLoop=False, noMap=False):
+    if listener.seed and not listener.newDir:
+        dirpath = f"{dirpath}/splited_files_mltb"
+        mkdir(dirpath)
     parts = ceil(size/LEECH_SPLIT_SIZE)
     if EQUAL_SPLITS and not inLoop:
         split_size = ceil(size/parts) + 1000

@@ -129,20 +139,29 @@ def split_file(path, size, file_, dirpath, split_size, listener, start_time=0, i
             out_path = ospath.join(dirpath, parted_name)
             if not noMap:
                 listener.suproc = Popen(["ffmpeg", "-hide_banner", "-loglevel", "error", "-ss", str(start_time),
-                                         "-i", path, "-fs", str(split_size), "-map", "0", "-map_chapters", "-1", "-c", "copy", out_path])
+                                         "-i", path, "-fs", str(split_size), "-map", "0", "-map_chapters", "-1",
+                                         "-c", "copy", out_path])
             else:
                 listener.suproc = Popen(["ffmpeg", "-hide_banner", "-loglevel", "error", "-ss", str(start_time),
-                                         "-i", path, "-fs", str(split_size), "-map_chapters", "-1", "-c", "copy", out_path])
+                                         "-i", path, "-fs", str(split_size), "-map_chapters", "-1", "-c", "copy",
+                                         out_path])
             listener.suproc.wait()
             if listener.suproc.returncode == -9:
                 return False
             elif listener.suproc.returncode != 0 and not noMap:
-                LOGGER.warning(f'Retrying without map, -map 0 not working in all situations. Path: {path}')
+                LOGGER.warning(f"Retrying without map, -map 0 not working in all situations. Path: {path}")
+                try:
+                    osremove(out_path)
+                except:
+                    pass
                 return split_file(path, size, file_, dirpath, split_size, listener, start_time, i, True, True)
             elif listener.suproc.returncode != 0:
                 LOGGER.warning(f"Unable to split this video, if it's size less than {MAX_SPLIT_SIZE} will be uploaded as it is. Path: {path}")
+                try:
+                    osremove(out_path)
+                except:
+                    pass
                 return "errored"
             out_size = get_path_size(out_path)
             if out_size > MAX_SPLIT_SIZE:
                 dif = out_size - MAX_SPLIT_SIZE

@@ -163,7 +182,8 @@ def split_file(path, size, file_, dirpath, split_size, listener, start_time=0, i
             i = i + 1
     else:
         out_path = ospath.join(dirpath, file_ + ".")
-        listener.suproc = Popen(["split", "--numeric-suffixes=1", "--suffix-length=3", f"--bytes={split_size}", path, out_path])
+        listener.suproc = Popen(["split", "--numeric-suffixes=1", "--suffix-length=3",
+                                 f"--bytes={split_size}", path, out_path])
         listener.suproc.wait()
         if listener.suproc.returncode == -9:
             return False
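The `EQUAL_SPLITS` branch above first computes how many parts the file needs at the normal limit, then re-sizes the parts so they come out nearly equal. A standalone sketch of that calculation (not part of the commit):

```python
from math import ceil


def equal_split_size(size: int, leech_split_size: int) -> int:
    parts = ceil(size / leech_split_size)  # parts needed at the normal limit
    return ceil(size / parts) + 1000       # near-equal parts plus a small margin


# Example: a 5,000,000,000-byte file with a 2,097,152,000-byte limit needs
# 3 parts, so each part is ~1.67 GB instead of two full parts plus a remainder.
print(equal_split_size(5_000_000_000, 2_097_152_000))  # 1666667667
```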
bot/helper/ext_utils/html_helper.py (new file, 125 lines)

@@ -0,0 +1,125 @@
+hmtl_content = """
+<html lang="en">
+<head>
+    <meta charset="UTF-8" />
+    <meta http-equiv="X-UA-Compatible" content="IE=edge" />
+    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
+    <title>{fileName}</title>
+    <link rel="preconnect" href="https://fonts.googleapis.com" />
+    <link rel="preconnect" href="https://fonts.gstatic.com" />
+    <link
+        href="https://fonts.googleapis.com/css2?family=Ubuntu:ital,wght@0,300;0,400;0,500;0,700;1,300;1,400;1,500;1,700&display=swap"
+        rel="stylesheet" />
+    <link rel="stylesheet" href="https://pro.fontawesome.com/releases/v5.10.0/css/all.css" />
+    <style>
+        * {
+            font-family: "Ubuntu", sans-serif;
+            list-style: none;
+            text-decoration: none;
+            outline: none !important;
+            color: white;
+            overflow-wrap: anywhere;
+        }
+        body {
+            background-color: #0D1117;
+        }
+        .container {
+            margin: 0vh 1vw;
+            margin-bottom: 1vh;
+            padding: 1vh 3vw;
+            display: list-item;
+            flex-direction: column;
+            border: 2px solid rgba(255, 255, 255, 0.11);
+            border-radius: 20px;
+            background-color: #161b22;
+            align-items: center;
+        }
+        .container.center {
+            text-align: center;
+        }
+        .container.start {
+            text-align: start;
+        }
+        .rfontsize {
+            font-size: 1rem;
+        }
+        .forhover:hover {
+            filter: invert(0.3);
+        }
+        .dlinks {
+            margin-top: 2.5vh;
+            display: inline-block;
+        }
+    </style>
+</head>
+<body>
+    {msg}
+</body>
+</html>
+"""
+
+html_template = """
+<html lang="en">
+<head>
+    <meta charset="UTF-8" />
+    <meta http-equiv="X-UA-Compatible" content="IE=edge" />
+    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
+    <title>{title}</title>
+    <link rel="preconnect" href="https://fonts.googleapis.com" />
+    <link rel="preconnect" href="https://fonts.gstatic.com" />
+    <link
+        href="https://fonts.googleapis.com/css2?family=Ubuntu:ital,wght@0,300;0,400;0,500;0,700;1,300;1,400;1,500;1,700&display=swap"
+        rel="stylesheet" />
+    <link rel="stylesheet" href="https://pro.fontawesome.com/releases/v5.10.0/css/all.css" />
+    <style>
+        * {
+            font-family: "Ubuntu", sans-serif;
+            list-style: none;
+            text-decoration: none;
+            outline: none !important;
+            color: white;
+            overflow-wrap: anywhere;
+        }
+        body {
+            background-color: #0D1117;
+        }
+        .container {
+            margin: 0vh 1vw;
+            margin-bottom: 1vh;
+            padding: 1vh 3vw;
+            display: list-item;
+            flex-direction: column;
+            border: 2px solid rgba(255, 255, 255, 0.11);
+            border-radius: 20px;
+            background-color: #161b22;
+            align-items: center;
+        }
+        .container.center {
+            text-align: center;
+        }
+        .container.start {
+            display: flex;
+            flex-direction: column;
+            align-items: flex-start;
+        }
+        .rfontsize {
+            font-size: 1rem;
+        }
+        .withhover:hover {
+            filter: invert(0.3);
+        }
+        .topmarginxl {
+            margin-top: 2.5vh;
+            display: inline-block;
+        }
+        .topmarginsm {
+            margin-top: 1vh;
+            display: inline-block;
+        }
+    </style>
+</head>
+<body>
+    {msg}
+</body>
+</html>
+"""
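Since the CSS braces would break plain `str.format`, these templates are presumably filled with simple string replacement. A minimal sketch of producing the leech-links HTML file that replaces the telegraph page (not part of the commit; the link markup and file name are hypothetical):

```python
# Hypothetical usage of hmtl_content: substitute the placeholders literally.
links = '<span class="dlinks"><a class="forhover rfontsize" href="https://example.com/file.mkv">file.mkv</a></span>'
page = hmtl_content.replace('{fileName}', 'MyTask').replace('{msg}', links)

with open('MyTask.html', 'w', encoding='utf-8') as f:
    f.write(page)  # this file is what the bot sends, with no length limits or flood waits
```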
bot/helper/ext_utils/telegraph_helper.py (deleted)

@@ -1,81 +0,0 @@
-# Implement By - @VarnaX-279
-
-from string import ascii_letters
-from random import SystemRandom
-
-from time import sleep
-from telegraph import Telegraph
-from telegraph.exceptions import RetryAfterError
-
-from bot import LOGGER
-
-
-class TelegraphHelper:
-    def __init__(self, author_name=None, author_url=None):
-        self.telegraph = Telegraph()
-        self.short_name = ''.join(SystemRandom().choices(ascii_letters, k=8))
-        self.access_token = None
-        self.author_name = author_name
-        self.author_url = author_url
-        self.create_account()
-
-    def create_account(self):
-        self.telegraph.create_account(
-            short_name=self.short_name,
-            author_name=self.author_name,
-            author_url=self.author_url
-        )
-        self.access_token = self.telegraph.get_access_token()
-        LOGGER.info("Creating Telegraph Account")
-
-    def create_page(self, title, content):
-        try:
-            return self.telegraph.create_page(
-                title=title,
-                author_name=self.author_name,
-                author_url=self.author_url,
-                html_content=content
-            )
-        except RetryAfterError as st:
-            LOGGER.warning(f'Telegraph Flood control exceeded. I will sleep for {st.retry_after} seconds.')
-            sleep(st.retry_after)
-            return self.create_page(title, content)
-
-    def edit_page(self, path, title, content):
-        try:
-            return self.telegraph.edit_page(
-                path=path,
-                title=title,
-                author_name=self.author_name,
-                author_url=self.author_url,
-                html_content=content
-            )
-        except RetryAfterError as st:
-            LOGGER.warning(f'Telegraph Flood control exceeded. I will sleep for {st.retry_after} seconds.')
-            sleep(st.retry_after)
-            return self.edit_page(path, title, content)
-
-    def edit_telegraph(self, path, telegraph_content):
-        nxt_page = 1
-        prev_page = 0
-        num_of_path = len(path)
-        for content in telegraph_content:
-            if nxt_page == 1:
-                content += f'<b><a href="https://telegra.ph/{path[nxt_page]}">Next</a></b>'
-                nxt_page += 1
-            else:
-                if prev_page <= num_of_path:
-                    content += f'<b><a href="https://telegra.ph/{path[prev_page]}">Prev</a></b>'
-                    prev_page += 1
-                if nxt_page < num_of_path:
-                    content += f'<b> | <a href="https://telegra.ph/{path[nxt_page]}">Next</a></b>'
-                    nxt_page += 1
-            self.edit_page(
-                path=path[prev_page],
-                title='Mirror-leech-bot Torrent Search',
-                content=content
-            )
-        return
-
-
-telegraph = TelegraphHelper('Mirror-Leech-Telegram-Bot', 'https://github.com/anasty17/mirror-leech-telegram-bot')
bot/helper/mirror_utils/download_utils/aria2_download.py

@@ -1,10 +1,10 @@
-from time import sleep
+from time import sleep, time

 from bot import aria2, download_dict_lock, download_dict, STOP_DUPLICATE, BASE_URL, LOGGER
 from bot.helper.mirror_utils.upload_utils.gdriveTools import GoogleDriveHelper
 from bot.helper.ext_utils.bot_utils import is_magnet, getDownloadByGid, new_thread, bt_selection_buttons
 from bot.helper.mirror_utils.status_utils.aria_download_status import AriaDownloadStatus
-from bot.helper.telegram_helper.message_utils import sendMarkup, sendStatusMessage, sendMessage, deleteMessage
+from bot.helper.telegram_helper.message_utils import sendMarkup, sendStatusMessage, sendMessage, deleteMessage, update_all_messages
 from bot.helper.ext_utils.fs_utils import get_base_name, clean_unwanted


@@ -12,34 +12,43 @@ from bot.helper.ext_utils.fs_utils import get_base_name, clean_unwanted
 def __onDownloadStarted(api, gid):
     download = api.get_download(gid)
     if download.is_metadata:
-        LOGGER.info(f'onDownloadStarted: {gid} Metadata')
+        LOGGER.info(f'onDownloadStarted: {gid} METADATA')
         sleep(1)
         dl = getDownloadByGid(gid)
-        if dl.listener().select:
+        listener = dl.listener()
+        if listener.select:
             metamsg = "Downloading Metadata, wait then you can select files. Use torrent file to avoid this wait."
-            meta = sendMessage(metamsg, dl.listener().bot, dl.listener().message)
+            meta = sendMessage(metamsg, listener.bot, listener.message)
             while True:
-                download = api.get_download(gid)
-                if download.followed_by_ids:
-                    deleteMessage(dl.listener().bot, meta)
+                try:
+                    download = api.get_download(gid)
+                except:
+                    deleteMessage(listener.bot, meta)
+                    break
+                if download.followed_by_ids:
+                    deleteMessage(listener.bot, meta)
                     break
                 sleep(1)
         return
     else:
-        LOGGER.info(f'onDownloadStarted: {gid}')
+        LOGGER.info(f'onDownloadStarted: {download.name} - Gid: {gid}')
     try:
         if STOP_DUPLICATE:
             sleep(1)
+            dl = getDownloadByGid(gid)
+            if not dl:
+                return
+            listener = dl.listener()
+            if listener.isLeech or listener.select:
+                return
+            download = api.get_download(gid)
+            if not download.is_torrent:
+                sleep(3)
+                download = api.get_download(gid)
-            dl = getDownloadByGid(gid)
-            if not dl or dl.listener().isLeech:
-                return
             download = download.live
             LOGGER.info('Checking File/Folder if already in Drive...')
             sname = download.name
-            if dl.listener().isZip:
+            if listener.isZip:
                 sname = sname + ".zip"
-            elif dl.listener().extract:
+            elif listener.extract:
                 try:
                     sname = get_base_name(sname)
                 except:

@@ -47,30 +56,76 @@ def __onDownloadStarted(api, gid):
             if sname is not None:
                 smsg, button = GoogleDriveHelper().drive_list(sname, True)
                 if smsg:
-                    dl.listener().onDownloadError('File/Folder already available in Drive.\n\n')
+                    listener.onDownloadError('File/Folder already available in Drive.\n\n')
                     api.remove([download], force=True, files=True)
-                    return sendMarkup("Here are the search results:", dl.listener().bot, dl.listener().message, button)
+                    return sendMarkup("Here are the search results:", listener.bot, listener.message, button)
     except Exception as e:
         LOGGER.error(f"{e} onDownloadStart: {gid} check duplicate didn't pass")

 @new_thread
 def __onDownloadComplete(api, gid):
-    download = api.get_download(gid)
+    try:
+        download = api.get_download(gid)
+    except:
+        return
     if download.followed_by_ids:
         new_gid = download.followed_by_ids[0]
         LOGGER.info(f'Gid changed from {gid} to {new_gid}')
-        if BASE_URL is not None:
-            dl = getDownloadByGid(new_gid)
-            if dl and dl.listener().select:
-                api.client.force_pause(new_gid)
-                SBUTTONS = bt_selection_buttons(new_gid)
-                msg = "Your download paused. Choose files then press Done Selecting button to start downloading."
-                sendMarkup(msg, dl.listener().bot, dl.listener().message, SBUTTONS)
-    elif dl := getDownloadByGid(gid):
-        LOGGER.info(f"onDownloadComplete: {gid}")
-        if dl.listener().select:
-            clean_unwanted(dl.path())
-        dl.listener().onDownloadComplete()
+        dl = getDownloadByGid(new_gid)
+        listener = dl.listener()
+        if BASE_URL is not None and listener.select:
+            SBUTTONS = bt_selection_buttons(new_gid)
+            msg = "Your download paused. Choose files then press Done Selecting button to start downloading."
+            sendMarkup(msg, listener.bot, listener.message, SBUTTONS)
+    elif download.is_torrent:
+        if dl := getDownloadByGid(gid):
+            if hasattr(dl, 'listener'):
+                listener = dl.listener()
+                if hasattr(listener, 'uploaded'):
+                    LOGGER.info(f"Cancelling Seed: {download.name} onDownloadComplete")
+                    listener.onUploadError(f"Seeding stopped with Ratio: {dl.ratio()} and Time: {dl.seeding_time()}")
+                    api.remove([download], force=True, files=True)
+    else:
+        LOGGER.info(f"onDownloadComplete: {download.name} - Gid: {gid}")
+        if dl := getDownloadByGid(gid):
+            dl.listener().onDownloadComplete()
+            api.remove([download], force=True, files=True)

+@new_thread
+def __onBtDownloadComplete(api, gid):
+    seed_start_time = time()
+    sleep(1)
+    download = api.get_download(gid)
+    LOGGER.info(f"onBtDownloadComplete: {download.name} - Gid: {gid}")
+    if dl := getDownloadByGid(gid):
+        listener = dl.listener()
+        if listener.select:
+            clean_unwanted(download.dir)
+        if listener.seed:
+            try:
+                api.set_options({'max-upload-limit': '0'}, [download])
+            except Exception as e:
+                LOGGER.error(f'{e} You are not able to seed because you added global option seed-time=0 without adding specific seed_time for this torrent')
+        listener.onDownloadComplete()
+        if listener.seed:
+            with download_dict_lock:
+                if listener.uid not in download_dict:
+                    api.remove([download], force=True, files=True)
+                    return
+                download_dict[listener.uid] = AriaDownloadStatus(gid, listener)
+                download_dict[listener.uid].start_time = seed_start_time
+            LOGGER.info(f"Seeding started: {download.name} - Gid: {gid}")
+            download = download.live
+            if download.is_complete:
+                if dl := getDownloadByGid(gid):
+                    LOGGER.info(f"Cancelling Seed: {download.name}")
+                    listener.onUploadError(f"Seeding stopped with Ratio: {dl.ratio()} and Time: {dl.seeding_time()}")
+                    api.remove([download], force=True, files=True)
+            else:
+                listener.uploaded = True
+                update_all_messages()
+        else:
+            api.remove([download], force=True, files=True)

 @new_thread
 def __onDownloadStopped(api, gid):

@@ -81,6 +136,7 @@ def __onDownloadStopped(api, gid):
 @new_thread
 def __onDownloadError(api, gid):
     LOGGER.info(f"onDownloadError: {gid}")
+    error = "None"
     try:
         download = api.get_download(gid)
         error = download.error_message

@@ -96,20 +152,30 @@ def start_listener():
                                    on_download_error=__onDownloadError,
                                    on_download_stop=__onDownloadStopped,
                                    on_download_complete=__onDownloadComplete,
-                                   timeout=30)
+                                   on_bt_download_complete=__onBtDownloadComplete,
+                                   timeout=60)

-def add_aria2c_download(link: str, path, listener, filename, auth, select):
+def add_aria2c_download(link: str, path, listener, filename, auth, select, ratio, seed_time):
+    args = {'dir': path, 'max-upload-limit': '1K'}
+    if filename:
+        args['out'] = filename
+    if auth:
+        args['header'] = f"authorization: {auth}"
+    if ratio:
+        args['seed-ratio'] = ratio
+    if seed_time:
+        args['seed-time'] = seed_time
     if is_magnet(link):
-        download = aria2.add_magnet(link, {'dir': path})
+        download = aria2.add_magnet(link, args)
     else:
-        download = aria2.add_uris([link], {'dir': path, 'out': filename, 'header': f"authorization: {auth}"})
+        download = aria2.add_uris([link], args)
     if download.error_message:
         error = str(download.error_message).replace('<', ' ').replace('>', ' ')
         LOGGER.info(f"Download Error: {error}")
         return sendMessage(error, listener.bot, listener.message)
     with download_dict_lock:
         download_dict[listener.uid] = AriaDownloadStatus(download.gid, listener)
-    LOGGER.info(f"Started: {download.gid} DIR: {download.dir} ")
+    LOGGER.info(f"Aria2Download started: {download.gid}")
     listener.onDownloadStart()
     if not select:
         sendStatusMessage(listener.message, listener.bot)
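The per-torrent seed options above map directly onto aria2's `seed-ratio` and `seed-time` input options. A minimal standalone sketch with the aria2p library (not part of the commit; hypothetical values, and it assumes a local aria2c daemon with RPC enabled, as started by aria.sh):

```python
import aria2p

# Connect to the RPC daemon that aria.sh launches.
aria2 = aria2p.API(aria2p.Client(host="http://localhost", port=6800, secret=""))

options = {
    "dir": "/usr/src/app/downloads",  # hypothetical download dir
    "seed-ratio": "1.0",              # stop seeding at ratio 1.0
    "seed-time": "60",                # or after 60 minutes, whichever comes first
}
download = aria2.add_magnet("magnet:?xt=urn:btih:...", options=options)
print(download.gid)
```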
bot/helper/mirror_utils/download_utils/direct_link_generator.py

@@ -28,7 +28,7 @@ fmed_list = ['fembed.net', 'fembed.com', 'femax20.com', 'fcdn.stream', 'feurl.co
 def direct_link_generator(link: str):
     """ direct links generator """
     if 'youtube.com' in link or 'youtu.be' in link:
-        raise DirectDownloadLinkException(f"ERROR: Use watch cmds for Youtube links")
+        raise DirectDownloadLinkException(f"ERROR: Use ytdl cmds for Youtube links")
     elif 'yadi.sk' in link or 'disk.yandex.com' in link:
         return yandex_disk(link)
     elif 'mediafire.com' in link:
bot/helper/mirror_utils/download_utils/gd_downloader.py

@@ -8,7 +8,7 @@ from bot.helper.telegram_helper.message_utils import sendMessage, sendStatusMessage
 from bot.helper.ext_utils.fs_utils import get_base_name


-def add_gd_download(link, listener, newname):
+def add_gd_download(link, path, listener, newname):
     res, size, name, files = GoogleDriveHelper().helper(link)
     if res != "":
         return sendMessage(res, listener.bot, listener.message)

@@ -29,7 +29,7 @@ def add_gd_download(link, listener, newname):
         msg = "File/Folder is already available in Drive.\nHere are the search results:"
         return sendMarkup(msg, listener.bot, listener.message, button)
     LOGGER.info(f"Download Name: {name}")
-    drive = GoogleDriveHelper(name, listener)
+    drive = GoogleDriveHelper(name, path, size, listener)
     gid = ''.join(SystemRandom().choices(ascii_letters + digits, k=12))
     download_status = GdDownloadStatus(drive, size, listener, gid)
     with download_dict_lock:
bot/helper/mirror_utils/download_utils/mega_downloader.py

@@ -76,7 +76,7 @@ class MegaAppListener(MegaListener):
         LOGGER.error(f'Mega Request error in {error}')
         if not self.is_cancelled:
             self.is_cancelled = True
-            self.listener.onDownloadError("RequestTempError: " + error.toString())
+            self.listener.onDownloadError(f"RequestTempError: {error.toString()}")
         self.error = error.toString()
         self.continue_event.set()
bot/helper/mirror_utils/download_utils/qb_downloader.py

@@ -30,7 +30,7 @@ class QbDownloader:
         self.__dupChecked = False
         self.__rechecked = False

-    def add_qb_torrent(self, link, path, select):
+    def add_qb_torrent(self, link, path, select, ratio, seed_time):
         self.__path = path
         self.select = select
         self.client = get_client()

@@ -44,9 +44,9 @@ class QbDownloader:
             sendMessage("This Torrent already added!", self.__listener.bot, self.__listener.message)
             return self.client.auth_log_out()
         if link.startswith('magnet:'):
-            op = self.client.torrents_add(link, save_path=path)
+            op = self.client.torrents_add(link, save_path=path, ratio_limit=ratio, seeding_time_limit=seed_time)
         else:
-            op = self.client.torrents_add(torrent_files=[link], save_path=path)
+            op = self.client.torrents_add(torrent_files=[link], save_path=path, ratio_limit=ratio, seeding_time_limit=seed_time)
         sleep(0.3)
         if op.lower() == "ok.":
             tor_info = self.client.torrents_info(torrent_hashes=self.ext_hash)

@@ -109,7 +109,7 @@ class QbDownloader:
             self.__onDownloadError("Dead Torrent!")
         elif tor_info.state == "downloading":
             self.__stalled_time = time()
-            if not self.__dupChecked and STOP_DUPLICATE and ospath.isdir(f'{self.__path}') and not self.__listener.isLeech and not self.select:
+            if not self.select and not self.__dupChecked and STOP_DUPLICATE and not self.__listener.isLeech and ospath.isdir(f'{self.__path}'):
                 LOGGER.info('Checking File/Folder if already in Drive')
                 qbname = str(listdir(f'{self.__path}')[-1])
                 if qbname.endswith('.!qB'):

@@ -149,26 +149,20 @@ class QbDownloader:
             if self.select:
                 clean_unwanted(self.__path)
             self.__listener.onDownloadComplete()
-            if self.__listener.seed and not self.__listener.isLeech and not self.__listener.extract:
+            if self.__listener.seed:
                 with download_dict_lock:
                     if self.__listener.uid not in download_dict:
-                        self.client.torrents_delete(torrent_hashes=self.ext_hash, delete_files=True)
-                        self.client.auth_log_out()
-                        self.__periodic.cancel()
+                        self.__remove_torrent()
                         return
                     download_dict[self.__listener.uid] = QbDownloadStatus(self.__listener, self)
                 self.is_seeding = True
                 update_all_messages()
-                LOGGER.info(f"Seeding started: {self.__name}")
+                LOGGER.info(f"Seeding started: {self.__name} - Hash: {self.ext_hash}")
             else:
-                self.client.torrents_delete(torrent_hashes=self.ext_hash, delete_files=True)
-                self.client.auth_log_out()
-                self.__periodic.cancel()
+                self.__remove_torrent()
+        elif tor_info.state == 'pausedUP' and self.__listener.seed:
+            self.__listener.onUploadError(f"Seeding stopped with Ratio: {round(tor_info.ratio, 3)} and Time: {get_readable_time(tor_info.seeding_time)}")
+            self.__remove_torrent()
     except Exception as e:
         LOGGER.error(str(e))

@@ -177,6 +171,9 @@ class QbDownloader:
         self.client.torrents_pause(torrent_hashes=self.ext_hash)
         sleep(0.3)
         self.__listener.onDownloadError(err)
+        self.__remove_torrent()
+
+    def __remove_torrent(self):
+        self.client.torrents_delete(torrent_hashes=self.ext_hash, delete_files=True)
+        self.client.auth_log_out()
+        self.__periodic.cancel()
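For reference, the `ratio_limit`/`seeding_time_limit` keywords passed above come from the qbittorrent-api client library. A minimal standalone sketch (not part of the commit; host, credentials and paths are hypothetical, and the seeding time limit is in minutes):

```python
import qbittorrentapi

client = qbittorrentapi.Client(host="localhost:8090")  # hypothetical WebUI host
client.auth_log_in()

op = client.torrents_add(
    urls="magnet:?xt=urn:btih:...",
    save_path="/usr/src/app/downloads",  # hypothetical save path
    ratio_limit=1.0,          # stop seeding at ratio 1.0
    seeding_time_limit=60,    # or after 60 minutes
)
print(op)  # "Ok." on success, which is what add_qb_torrent checks for
client.auth_log_out()
```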
bot/helper/mirror_utils/download_utils/yt_dlp_download_helper.py

@@ -173,8 +173,8 @@ class YoutubeDLHelper:
             if len(audio_info) == 2:
                 rate = audio_info[1]
             else:
-                rate = 320
-            self.opts['postprocessors'] = [{'key': 'FFmpegExtractAudio','preferredcodec': 'mp3','preferredquality': f'{rate}'}]
+                rate = '320'
+            self.opts['postprocessors'] = [{'key': 'FFmpegExtractAudio', 'preferredcodec': 'mp3', 'preferredquality': rate}]
         self.opts['format'] = qual
         LOGGER.info(f"Downloading with YT-DLP: {link}")
         self.extractMetaData(link, name, args)
@ -1,5 +1,7 @@
|
||||
from bot import aria2, DOWNLOAD_DIR, LOGGER
|
||||
from bot.helper.ext_utils.bot_utils import MirrorStatus
|
||||
from time import time
|
||||
|
||||
from bot import aria2, LOGGER
|
||||
from bot.helper.ext_utils.bot_utils import MirrorStatus, get_readable_time
|
||||
|
||||
def get_download(gid):
|
||||
try:
|
||||
@ -14,15 +16,14 @@ class AriaDownloadStatus:
|
||||
self.__gid = gid
|
||||
self.__download = get_download(gid)
|
||||
self.__listener = listener
|
||||
self.start_time = 0
|
||||
self.message = listener.message
|
||||
|
||||
def path(self):
|
||||
return f'{DOWNLOAD_DIR}{self.__listener.uid}'
|
||||
|
||||
def __update(self):
|
||||
self.__download = get_download(self.__gid)
|
||||
self.__download = self.__download.live
|
||||
if self.__download.followed_by_ids:
|
||||
self.__gid = self.__download.followed_by_ids[0]
|
||||
self.__download = get_download(self.__gid)
|
||||
|
||||
def progress(self):
|
||||
"""
|
||||
@ -61,11 +62,28 @@ class AriaDownloadStatus:
|
||||
return MirrorStatus.STATUS_WAITING
|
||||
elif download.is_paused:
|
||||
return MirrorStatus.STATUS_PAUSED
|
||||
elif download.seeder and hasattr(self.__listener, 'uploaded'):
|
||||
return MirrorStatus.STATUS_SEEDING
|
||||
else:
|
||||
return MirrorStatus.STATUS_DOWNLOADING
|
||||
|
||||
def aria_download(self):
|
||||
return self.__download
|
||||
def seeders_num(self):
|
||||
return self.__download.num_seeders
|
||||
|
||||
def leechers_num(self):
|
||||
return self.__download.connections
|
||||
|
||||
def uploaded_bytes(self):
|
||||
return self.__download.upload_length_string()
|
||||
|
||||
def upload_speed(self):
|
||||
return self.__download.upload_speed_string()
|
||||
|
||||
def ratio(self):
|
||||
return f"{round(self.__download.upload_length / self.__download.completed_length, 3)}"
|
||||
|
||||
def seeding_time(self):
|
||||
return f"{get_readable_time(time() - self.start_time)}"

    def download(self):
        return self

@@ -78,18 +96,17 @@ class AriaDownloadStatus:
        return self.__gid

    def cancel_download(self):
        LOGGER.info(f"Cancelling Download: {self.name()}")
        self.__update()
        download = self.__download
        if download.is_waiting:
            self.__listener.onDownloadError("Cancelled by user")
            aria2.remove([download], force=True, files=True)
            return
        if len(download.followed_by_ids) != 0:
            downloads = aria2.get_downloads(download.followed_by_ids)
        if self.__download.seeder:
            LOGGER.info(f"Cancelling Seed: {self.name()}")
            self.__listener.onUploadError(f"Seeding stopped with Ratio: {self.ratio()} and Time: {self.seeding_time()}")
            aria2.remove([self.__download], force=True, files=True)
        elif len(self.__download.followed_by_ids) != 0:
            LOGGER.info(f"Cancelling Download: {self.name()}")
            downloads = aria2.get_downloads(self.__download.followed_by_ids)
            self.__listener.onDownloadError('Download stopped by user!')
            aria2.remove(downloads, force=True, files=True)
            aria2.remove([download], force=True, files=True)
            return
        self.__listener.onDownloadError('Download stopped by user!')
        aria2.remove([download], force=True, files=True)
        else:
            LOGGER.info(f"Cancelling Download: {self.name()}")
            self.__listener.onDownloadError('Download stopped by user!')
            aria2.remove([self.__download], force=True, files=True)

@@ -53,7 +53,10 @@ class ExtractStatus:
        return MirrorStatus.STATUS_EXTRACTING

    def processed_bytes(self):
        return get_path_size(f"{DOWNLOAD_DIR}{self.__uid}") - self.__size
        if self.__listener.newDir:
            return get_path_size(f"{DOWNLOAD_DIR}{self.__uid}10000")
        else:
            return get_path_size(f"{DOWNLOAD_DIR}{self.__uid}") - self.__size
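        # Editor note: "<uid>10000" matches the temporary working dir the listener
        # creates when seeding (newDir = f"{dir}10000"), so progress is measured there.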

    def download(self):
        return self

@@ -9,7 +9,7 @@ class GdDownloadStatus:
        self.message = listener.message

    def processed_bytes(self):
        return self.__obj.downloaded_bytes
        return self.__obj.processed_bytes

    def size_raw(self):
        return self.__size

@@ -28,7 +28,7 @@ class GdDownloadStatus:
    def progress_raw(self):
        try:
            return self.__obj.downloaded_bytes / self.__size * 100
            return self.__obj.processed_bytes / self.__size * 100
        except:
            return 0

@@ -39,14 +39,14 @@ class GdDownloadStatus:
        """
        :return: Download speed in Bytes/Seconds
        """
        return self.__obj.dspeed()
        return self.__obj.speed()

    def speed(self):
        return f'{get_readable_file_size(self.speed_raw())}/s'

    def eta(self):
        try:
            seconds = (self.__size - self.__obj.downloaded_bytes) / self.speed_raw()
            seconds = (self.__size - self.__obj.processed_bytes) / self.speed_raw()
            return f'{get_readable_time(seconds)}'
        except:
            return '-'

@@ -70,8 +70,23 @@ class QbDownloadStatus:
        else:
            return MirrorStatus.STATUS_DOWNLOADING

    def torrent_info(self):
        return self.__info

    def seeders_num(self):
        return self.__info.num_seeds

    def leechers_num(self):
        return self.__info.num_leechs

    def uploaded_bytes(self):
        return f"{get_readable_file_size(self.__info.uploaded)}"

    def upload_speed(self):
        return f"{get_readable_file_size(self.__info.upspeed)}/s"

    def ratio(self):
        return f"{round(self.__info.ratio, 3)}"

    def seeding_time(self):
        return f"{get_readable_time(self.__info.seeding_time)}"

    def download(self):
        return self.__obj

@@ -9,7 +9,7 @@ class UploadStatus:
        self.message = listener.message

    def processed_bytes(self):
        return self.__obj.uploaded_bytes
        return self.__obj.processed_bytes

    def size_raw(self):
        return self.__size

@@ -25,7 +25,7 @@ class UploadStatus:
    def progress_raw(self):
        try:
            return self.__obj.uploaded_bytes / self.__size * 100
            return self.__obj.processed_bytes / self.__size * 100
        except ZeroDivisionError:
            return 0

@@ -43,7 +43,7 @@ class UploadStatus:
    def eta(self):
        try:
            seconds = (self.__size - self.__obj.uploaded_bytes) / self.speed_raw()
            seconds = (self.__size - self.__obj.processed_bytes) / self.speed_raw()
            return f'{get_readable_time(seconds)}'
        except ZeroDivisionError:
            return '-'

@@ -53,7 +53,10 @@ class ZipStatus:
        return MirrorStatus.STATUS_ARCHIVING

    def processed_bytes(self):
        return get_path_size(f"{DOWNLOAD_DIR}{self.__uid}") - self.__size
        if self.__listener.newDir:
            return get_path_size(f"{DOWNLOAD_DIR}{self.__uid}10000")
        else:
            return get_path_size(f"{DOWNLOAD_DIR}{self.__uid}") - self.__size

    def download(self):
        return self

@@ -2,7 +2,7 @@ from logging import getLogger, ERROR
from time import time
from pickle import load as pload
from json import loads as jsnloads
from os import makedirs, path as ospath, listdir
from os import makedirs, path as ospath, listdir, remove as osremove
from requests.utils import quote as rquote
from io import FileIO
from re import search as re_search

@@ -16,11 +16,10 @@ from telegram import InlineKeyboardMarkup
from tenacity import retry, wait_exponential, stop_after_attempt, retry_if_exception_type, RetryError

from bot.helper.telegram_helper.button_build import ButtonMaker
from bot import parent_id, DOWNLOAD_DIR, IS_TEAM_DRIVE, INDEX_URL, USE_SERVICE_ACCOUNTS, VIEW_LINK, \
from bot import parent_id, IS_TEAM_DRIVE, INDEX_URL, USE_SERVICE_ACCOUNTS, VIEW_LINK, \
    DRIVES_NAMES, DRIVES_IDS, INDEX_URLS, EXTENSION_FILTER
from bot.helper.ext_utils.telegraph_helper import telegraph
from bot.helper.ext_utils.bot_utils import get_readable_file_size, setInterval
from bot.helper.ext_utils.fs_utils import get_mime_type, get_path_size
from bot.helper.ext_utils.fs_utils import get_mime_type

LOGGER = getLogger(__name__)
getLogger('googleapiclient.discovery').setLevel(ERROR)

@@ -31,40 +30,36 @@ if USE_SERVICE_ACCOUNTS:

class GoogleDriveHelper:

    def __init__(self, name=None, listener=None):
    def __init__(self, name=None, path=None, size=0, listener=None):
        self.__G_DRIVE_TOKEN_FILE = "token.pickle"
        # Check https://developers.google.com/drive/scopes for all available scopes
        self.__OAUTH_SCOPE = ['https://www.googleapis.com/auth/drive']
        # Redirect URI for installed apps, can be left as is
        self.__REDIRECT_URI = "urn:ietf:wg:oauth:2.0:oob"
        self.__G_DRIVE_DIR_MIME_TYPE = "application/vnd.google-apps.folder"
        self.__G_DRIVE_BASE_DOWNLOAD_URL = "https://drive.google.com/uc?id={}&export=download"
        self.__G_DRIVE_DIR_BASE_DOWNLOAD_URL = "https://drive.google.com/drive/folders/{}"
        self.__listener = listener
        self.__path = path
        self.__service = self.__authorize()
        self._file_uploaded_bytes = 0
        self._file_downloaded_bytes = 0
        self.uploaded_bytes = 0
        self.downloaded_bytes = 0
        self.start_time = 0
        self.total_time = 0
        self.dtotal_time = 0
        self.is_uploading = False
        self.is_downloading = False
        self.is_cloning = False
        self.is_cancelled = False
        self.is_errored = False
        self.status = None
        self.dstatus = None
        self.updater = None
        self.name = name
        self.update_interval = 3
        self.__total_bytes = 0
        self.__total_files = 0
        self.__total_folders = 0
        self.transferred_size = 0
        self.__sa_count = 0
        self.alt_auth = False
        self.__start_time = 0
        self.__total_time = 0
        self.__alt_auth = False
        self.__is_uploading = False
        self.__is_downloading = False
        self.__is_cloning = False
        self.__is_cancelled = False
        self.__is_errored = False
        self.__status = None
        self.__updater = None
        self.__update_interval = 3
        self.__size = size
        self._file_processed_bytes = 0
        self.name = name
        self.processed_bytes = 0
        self.transferred_size = 0

    def speed(self):
        """

@@ -72,19 +67,13 @@ class GoogleDriveHelper:
        :return: Upload speed in bytes/second
        """
        try:
            return self.uploaded_bytes / self.total_time
        except:
            return 0

    def dspeed(self):
        try:
            return self.downloaded_bytes / self.dtotal_time
            return self.processed_bytes / self.__total_time
        except:
            return 0

    def cspeed(self):
        try:
            return self.transferred_size / int(time() - self.start_time)
            return self.transferred_size / int(time() - self.__start_time)
        except:
            return 0

@@ -99,12 +88,12 @@ class GoogleDriveHelper:
        parsed = urlparse(link)
        return parse_qs(parsed.query)['id'][0]

    def _on_upload_progress(self):
        if self.status is not None:
            chunk_size = self.status.total_size * self.status.progress() - self._file_uploaded_bytes
            self._file_uploaded_bytes = self.status.total_size * self.status.progress()
            self.uploaded_bytes += chunk_size
            self.total_time += self.update_interval
    def _progress(self):
        if self.__status is not None:
            chunk_size = self.__status.total_size * self.__status.progress() - self._file_processed_bytes
            self._file_processed_bytes = self.__status.total_size * self.__status.progress()
            self.processed_bytes += chunk_size
            self.__total_time += self.__update_interval
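    # Editor note: _progress() replaces the old per-direction callbacks; it turns
    # googleapiclient's status.progress() fraction into a byte delta per update
    # interval and accumulates it in the shared processed_bytes counter.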

    def deletefile(self, link: str):
        try:

@@ -189,10 +178,10 @@ class GoogleDriveHelper:
                                           body=file_metadata, media_body=media_body)
        response = None
        while response is None:
            if self.is_cancelled:
            if self.__is_cancelled:
                break
            try:
                self.status, response = drive_file.next_chunk()
                self.__status, response = drive_file.next_chunk()
            except HttpError as err:
                if err.resp.get('content-type', '').startswith('application/json'):
                    reason = jsnloads(err.content).get('error').get('errors')[0].get('reason')

@@ -208,9 +197,14 @@ class GoogleDriveHelper:
                    else:
                        LOGGER.error(f"Got: {reason}")
                        raise err
        if self.is_cancelled:
        if self.__is_cancelled:
            return
        self._file_uploaded_bytes = 0
        if not self.__listener.seed or self.__listener.newDir:
            try:
                osremove(file_path)
            except:
                pass
        self._file_processed_bytes = 0
        # Insert new permissions
        if not IS_TEAM_DRIVE:
            self.__set_permission(response['id'])

@@ -220,22 +214,20 @@ class GoogleDriveHelper:
        return download_url

    def upload(self, file_name: str):
        self.is_downloading = False
        self.is_uploading = True
        file_dir = f"{DOWNLOAD_DIR}{self.__listener.message.message_id}"
        file_path = f"{file_dir}/{file_name}"
        size = get_readable_file_size(get_path_size(file_path))
        LOGGER.info("Uploading File: " + file_path)
        self.updater = setInterval(self.update_interval, self._on_upload_progress)
        self.__is_uploading = True
        file_path = f"{self.__path}/{file_name}"
        size = get_readable_file_size(self.__size)
        LOGGER.info(f"Uploading File: {file_path}")
        self.__updater = setInterval(self.__update_interval, self._progress)
        try:
            if ospath.isfile(file_path):
                mime_type = get_mime_type(file_path)
                link = self.__upload_file(file_path, file_name, mime_type, parent_id)
                if self.is_cancelled:
                if self.__is_cancelled:
                    return
                if link is None:
                    raise Exception('Upload has been manually cancelled')
                LOGGER.info("Uploaded To G-Drive: " + file_path)
                LOGGER.info(f"Uploaded To G-Drive: {file_path}")
            else:
                mime_type = 'Folder'
                dir_id = self.__create_directory(ospath.basename(ospath.abspath(file_name)), parent_id)

@@ -243,24 +235,24 @@ class GoogleDriveHelper:
                if result is None:
                    raise Exception('Upload has been manually cancelled!')
                link = f"https://drive.google.com/folderview?id={dir_id}"
                if self.is_cancelled:
                if self.__is_cancelled:
                    return
                LOGGER.info("Uploaded To G-Drive: " + file_name)
                LOGGER.info(f"Uploaded To G-Drive: {file_name}")
        except Exception as err:
            if isinstance(err, RetryError):
                LOGGER.info(f"Total Attempts: {err.last_attempt.attempt_number}")
                err = err.last_attempt.exception()
            self.__listener.onUploadError(str(err))
            self.is_errored = True
            self.__is_errored = True
        finally:
            self.updater.cancel()
            if self.is_cancelled and not self.is_errored:
            self.__updater.cancel()
            if self.__is_cancelled and not self.__is_errored:
                if mime_type == 'Folder':
                    LOGGER.info("Deleting uploaded data from Drive...")
                    link = f"https://drive.google.com/folderview?id={dir_id}"
                    self.deletefile(link)
                return
            elif self.is_errored:
            elif self.__is_errored:
                return
        self.__listener.onUploadComplete(link, size, self.__total_files, self.__total_folders, mime_type, self.name)

@@ -284,13 +276,13 @@ class GoogleDriveHelper:
            if reason in ['userRateLimitExceeded', 'dailyLimitExceeded']:
                if USE_SERVICE_ACCOUNTS:
                    if self.__sa_count == len(listdir("accounts")) or self.__sa_count > 50:
                        self.is_cancelled = True
                        self.__is_cancelled = True
                        raise err
                    else:
                        self.__switchServiceAccount()
                        return self.__copyFile(file_id, dest_id)
                else:
                    self.is_cancelled = True
                    self.__is_cancelled = True
                    LOGGER.error(f"Got: {reason}")
                    raise err
            else:

@@ -324,8 +316,8 @@ class GoogleDriveHelper:
        return files

    def clone(self, link):
        self.is_cloning = True
        self.start_time = time()
        self.__is_cloning = True
        self.__start_time = time()
        self.__total_files = 0
        self.__total_folders = 0
        try:

@@ -342,7 +334,7 @@ class GoogleDriveHelper:
            dir_id = self.__create_directory(meta.get('name'), parent_id)
            self.__cloneFolder(meta.get('name'), meta.get('name'), meta.get('id'), dir_id)
            durl = self.__G_DRIVE_DIR_BASE_DOWNLOAD_URL.format(dir_id)
            if self.is_cancelled:
            if self.__is_cancelled:
                LOGGER.info("Deleting cloned data from Drive...")
                self.deletefile(durl)
                return "your clone has been stopped and cloned data has been deleted!", "cancelled"

@@ -407,7 +399,7 @@ class GoogleDriveHelper:
                self.__total_files += 1
                self.transferred_size += int(file.get('size', 0))
                self.__copyFile(file.get('id'), parent_id)
            if self.is_cancelled:
            if self.__is_cancelled:
                break

    @retry(wait=wait_exponential(multiplier=2, min=3, max=6), stop=stop_after_attempt(3),

@@ -445,7 +437,7 @@ class GoogleDriveHelper:
                self.__upload_file(current_file_name, file_name, mime_type, parent_id)
                self.__total_files += 1
                new_id = parent_id
            if self.is_cancelled:
            if self.__is_cancelled:
                break
        return new_id

@@ -467,8 +459,8 @@ class GoogleDriveHelper:

    def __alt_authorize(self):
        credentials = None
        if USE_SERVICE_ACCOUNTS and not self.alt_auth:
            self.alt_auth = True
        if USE_SERVICE_ACCOUNTS and not self.__alt_auth:
            self.__alt_auth = True
            if ospath.exists(self.__G_DRIVE_TOKEN_FILE):
                LOGGER.info("Authorize with token.pickle")
                with open(self.__G_DRIVE_TOKEN_FILE, 'rb') as f:

@@ -479,7 +471,7 @@ class GoogleDriveHelper:
    def __escapes(self, str):
        chars = ['\\', "'", '"', r'\a', r'\b', r'\f', r'\n', r'\r', r'\t']
        for char in chars:
            str = str.replace(char, '\\' + char)
            str = str.replace(char, f'\\{char}')
        return str.strip()
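    # Editor note: this escaping matters for the Drive search query built later;
    # an unescaped name like D'Angelo "Live" (hypothetical) would break the q-string.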

    def __get_recursive_list(self, file, rootid):

@@ -522,7 +514,7 @@ class GoogleDriveHelper:
        if parent_id == "root":
            return (
                self.__service.files()
                .list(q=query + " and 'me' in owners",
                .list(q=f"{query} and 'me' in owners",
                      pageSize=200,
                      spaces='drive',
                      fields='files(id, name, mimeType, size, parents)',

@@ -581,92 +573,78 @@ class GoogleDriveHelper:
        msg = ""
        fileName = self.__escapes(str(fileName))
        contents_count = 0
        telegraph_content = []
        path = []
        Title = False
        if len(DRIVES_IDS) > 1:
            token_service = self.__alt_authorize()
            if token_service is not None:
                self.__service = token_service
        for index, parent_id in enumerate(DRIVES_IDS):
            if isRecursive and len(parent_id) > 23:
                isRecur = False
            else:
                isRecur = isRecursive
            isRecur = False if isRecursive and len(parent_id) > 23 else isRecursive
            response = self.__drive_query(parent_id, fileName, stopDup, isRecur, itemType)
            if not response["files"] and noMulti:
                break
            elif not response["files"]:
                continue
            if not response["files"]:
                if noMulti:
                    break
                else:
                    continue
            if not Title:
                msg += f'<h4>Search Result For {fileName}</h4>'
                msg += '<span class="container center rfontsize">' \
                       f'<h4>Search Result For {fileName}</h4></span>'
                Title = True
            if len(DRIVES_NAMES) > 1 and DRIVES_NAMES[index] is not None:
                msg += f"╾────────────╼<br><b>{DRIVES_NAMES[index]}</b><br>╾────────────╼<br>"
                msg += '<span class="container center rfontsize">' \
                       f'<b>{DRIVES_NAMES[index]}</b></span>'
            for file in response.get('files', []):
                mime_type = file.get('mimeType')
                if mime_type == "application/vnd.google-apps.folder":
                    furl = f"https://drive.google.com/drive/folders/{file.get('id')}"
                    msg += f"📁 <code>{file.get('name')}<br>(folder)</code><br>"
                    msg += f"<b><a href={furl}>Drive Link</a></b>"
                    msg += '<span class="container start rfontsize">' \
                           f"<div>📁 {file.get('name')} (folder)</div>" \
                           '<div class="dlinks">' \
                           f'<span> <a class="forhover" href="{furl}">Drive Link</a></span>'
                    if INDEX_URLS[index] is not None:
                        if isRecur:
                            url_path = "/".join([rquote(n, safe='') for n in self.__get_recursive_list(file, parent_id)])
                        else:
                            url_path = rquote(f'{file.get("name")}', safe='')
                        url = f'{INDEX_URLS[index]}/{url_path}/'
                        msg += f' <b>| <a href="{url}">Index Link</a></b>'
                        msg += '<span> | </span>' \
                               f'<span> <a class="forhover" href="{url}">Index Link</a></span>'
                elif mime_type == 'application/vnd.google-apps.shortcut':
                    msg += f"⁍<a href='https://drive.google.com/drive/folders/{file.get('id')}'>{file.get('name')}" \
                           f"</a> (shortcut)"
                    # Excluded index link as indexes cant download or open these shortcuts
                    furl = f"https://drive.google.com/drive/folders/{file.get('id')}"
                    msg += '<span class="container start rfontsize">' \
                           f"<div>📁 {file.get('name')} (shortcut)</div>" \
                           '<div class="dlinks">' \
                           f'<span> <a class="forhover" href="{furl}">Drive Link</a></span>'\
                           '</div></span>'
                else:
                    furl = f"https://drive.google.com/uc?id={file.get('id')}&export=download"
                    msg += f"📄 <code>{file.get('name')}<br>({get_readable_file_size(int(file.get('size', 0)))})</code><br>"
                    msg += f"<b><a href={furl}>Drive Link</a></b>"
                    msg += '<span class="container start rfontsize">' \
                           f"<div>📄 {file.get('name')} ({get_readable_file_size(int(file.get('size', 0)))})</div>" \
                           '<div class="dlinks">' \
                           f'<span> <a class="forhover" href="{furl}">Drive Link</a></span>'
                    if INDEX_URLS[index] is not None:
                        if isRecur:
                            url_path = "/".join(
                                rquote(n, safe='')
                                for n in self.__get_recursive_list(file, parent_id)
                            )

                            url_path = "/".join(rquote(n, safe='') for n in self.__get_recursive_list(file, parent_id))
                        else:
                            url_path = rquote(f'{file.get("name")}')
                        url = f'{INDEX_URLS[index]}/{url_path}'
                        msg += f' <b>| <a href="{url}">Index Link</a></b>'
                        msg += '<span> | </span>' \
                               f'<span> <a class="forhover" href="{url}">Index Link</a></span>'
                        if VIEW_LINK:
                            urlv = f'{INDEX_URLS[index]}/{url_path}?a=view'
                            msg += f' <b>| <a href="{urlv}">View Link</a></b>'
                msg += '<br><br>'
                            msg += '<span> | </span>' \
                                   f'<span> <a class="forhover" href="{urlv}">View Link</a></span>'
                msg += '</div></span>'
                contents_count += 1
                if len(msg.encode('utf-8')) > 39000:
                    telegraph_content.append(msg)
                    msg = ""
            if noMulti:
                break

        if msg != '':
            telegraph_content.append(msg)
        if contents_count == 0:
            return "", ""

        if len(telegraph_content) == 0:
            return "", None
        rmsg = f"<b>Found {contents_count} result for <i>{fileName}</i></b>"

        for content in telegraph_content:
            path.append(
                telegraph.create_page(
                    title='Mirror-Leech-Bot Drive Search',
                    content=content
                )["path"]
            )
        if len(path) > 1:
            telegraph.edit_telegraph(path, telegraph_content)

        msg = f"<b>Found {contents_count} result for <i>{fileName}</i></b>"
        buttons = ButtonMaker()
        buttons.buildbutton("🔎 VIEW", f"https://telegra.ph/{path[0]}")

        return msg, InlineKeyboardMarkup(buttons.build_menu(1))
        return rmsg, msg

    def count(self, link):
        try:

@@ -769,17 +747,16 @@ class GoogleDriveHelper:
        return "", size, name, files

    def download(self, link):
        self.is_downloading = True
        self.__is_downloading = True
        file_id = self.__getIdFromUrl(link)
        self.updater = setInterval(self.update_interval, self._on_download_progress)
        self.__updater = setInterval(self.__update_interval, self._progress)
        try:
            meta = self.__getFileMetadata(file_id)
            path = f"{DOWNLOAD_DIR}{self.__listener.uid}/"
            if meta.get("mimeType") == self.__G_DRIVE_DIR_MIME_TYPE:
                self.__download_folder(file_id, path, self.name)
                self.__download_folder(file_id, self.__path, self.name)
            else:
                makedirs(path)
                self.__download_file(file_id, path, self.name, meta.get('mimeType'))
                self.__download_file(file_id, self.__path, self.name, meta.get('mimeType'))
        except Exception as err:
            if isinstance(err, RetryError):
                LOGGER.info(f"Total Attempts: {err.last_attempt.attempt_number}")

@@ -791,21 +768,21 @@ class GoogleDriveHelper:
                token_service = self.__alt_authorize()
                if token_service is not None:
                    self.__service = token_service
                    self.updater.cancel()
                    self.__updater.cancel()
                    return self.download(link)
            self.__listener.onDownloadError(err)
            self.is_cancelled = True
            self.__is_cancelled = True
        finally:
            self.updater.cancel()
            if self.is_cancelled:
            self.__updater.cancel()
            if self.__is_cancelled:
                return
        self.__listener.onDownloadComplete()

    def __download_folder(self, folder_id, path, folder_name):
        folder_name = folder_name.replace('/', '')
        if not ospath.exists(path + folder_name):
            makedirs(path + folder_name)
        path += folder_name + '/'
        if not ospath.exists(f"{path}/{folder_name}"):
            makedirs(f"{path}/{folder_name}")
        path += f"/{folder_name}"
        result = self.__getFilesByFolderId(folder_id)
        if len(result) == 0:
            return

@@ -821,9 +798,9 @@ class GoogleDriveHelper:
            mime_type = item.get('mimeType')
            if mime_type == self.__G_DRIVE_DIR_MIME_TYPE:
                self.__download_folder(file_id, path, filename)
            elif not ospath.isfile(path + filename) and not filename.lower().endswith(tuple(EXTENSION_FILTER)):
            elif not ospath.isfile(f"{path}{filename}") and not filename.lower().endswith(tuple(EXTENSION_FILTER)):
                self.__download_file(file_id, path, filename, mime_type)
            if self.is_cancelled:
            if self.__is_cancelled:
                break

    @retry(wait=wait_exponential(multiplier=2, min=3, max=6), stop=stop_after_attempt(3),

@@ -833,18 +810,18 @@ class GoogleDriveHelper:
        filename = filename.replace('/', '')
        if len(filename.encode()) > 255:
            ext = ospath.splitext(filename)[1]
            filename = filename[:245] + ext
            filename = f"{filename[:245]}{ext}"
            if self.name.endswith(ext):
                self.name = filename
        fh = FileIO(f"{path}{filename}", 'wb')
        fh = FileIO(f"{path}/{filename}", 'wb')
        downloader = MediaIoBaseDownload(fh, request, chunksize=50 * 1024 * 1024)
        done = False
        while not done:
            if self.is_cancelled:
            if self.__is_cancelled:
                fh.close()
                break
            try:
                self.dstatus, done = downloader.next_chunk()
                self.__status, done = downloader.next_chunk()
            except HttpError as err:
                if err.resp.get('content-type', '').startswith('application/json'):
                    reason = jsnloads(err.content).get('error').get('errors')[0].get('reason')

@@ -855,7 +832,7 @@ class GoogleDriveHelper:
                        raise err
                    if USE_SERVICE_ACCOUNTS:
                        if self.__sa_count == len(listdir("accounts")) or self.__sa_count > 50:
                            self.is_cancelled = True
                            self.__is_cancelled = True
                            raise err
                        else:
                            self.__switchServiceAccount()

@@ -864,22 +841,15 @@ class GoogleDriveHelper:
                    else:
                        LOGGER.error(f"Got: {reason}")
                        raise err
        self._file_downloaded_bytes = 0

    def _on_download_progress(self):
        if self.dstatus is not None:
            chunk_size = self.dstatus.total_size * self.dstatus.progress() - self._file_downloaded_bytes
            self._file_downloaded_bytes = self.dstatus.total_size * self.dstatus.progress()
            self.downloaded_bytes += chunk_size
            self.dtotal_time += self.update_interval
        self._file_processed_bytes = 0

    def cancel_download(self):
        self.is_cancelled = True
        if self.is_downloading:
        self.__is_cancelled = True
        if self.__is_downloading:
            LOGGER.info(f"Cancelling Download: {self.name}")
            self.__listener.onDownloadError('Download stopped by user!')
        elif self.is_cloning:
        elif self.__is_cloning:
            LOGGER.info(f"Cancelling Clone: {self.name}")
        elif self.is_uploading:
        elif self.__is_uploading:
            LOGGER.info(f"Cancelling Upload: {self.name}")
            self.__listener.onUploadError('your upload has been stopped and uploaded data has been deleted!')
@@ -1,16 +1,16 @@
from logging import getLogger, WARNING
from logging import getLogger, ERROR
from os import remove as osremove, walk, path as ospath, rename as osrename
from time import time, sleep
from pyrogram.errors import FloodWait, RPCError
from PIL import Image
from threading import RLock

from bot import DOWNLOAD_DIR, AS_DOCUMENT, AS_DOC_USERS, AS_MEDIA_USERS, CUSTOM_FILENAME, EXTENSION_FILTER, app
from bot.helper.ext_utils.fs_utils import take_ss, get_media_info, get_path_size
from bot import AS_DOCUMENT, AS_DOC_USERS, AS_MEDIA_USERS, CUSTOM_FILENAME, EXTENSION_FILTER, app, LEECH_SPLIT_SIZE
from bot.helper.ext_utils.fs_utils import take_ss, get_media_info, clean_unwanted
from bot.helper.ext_utils.bot_utils import get_readable_file_size

LOGGER = getLogger(__name__)
getLogger("pyrogram").setLevel(WARNING)
getLogger("pyrogram").setLevel(ERROR)

VIDEO_SUFFIXES = ("MKV", "MP4", "MOV", "WMV", "3GP", "MPG", "WEBM", "AVI", "FLV", "M4V", "GIF")
AUDIO_SUFFIXES = ("MP3", "M4A", "M4B", "FLAC", "WAV", "AIF", "OGG", "AAC", "DTS", "MID", "AMR", "MKA")

@@ -19,11 +19,12 @@ IMAGE_SUFFIXES = ("JPG", "JPX", "PNG", "CR2", "TIF", "BMP", "JXR", "PSD", "ICO",

class TgUploader:

    def __init__(self, name=None, listener=None):
    def __init__(self, name=None, path=None, size=0, listener=None):
        self.name = name
        self.uploaded_bytes = 0
        self._last_uploaded = 0
        self.__listener = listener
        self.__path = path
        self.__start_time = time()
        self.__total_files = 0
        self.__is_cancelled = False

@@ -34,16 +35,17 @@ class TgUploader:
        self.__resource_lock = RLock()
        self.__is_corrupted = False
        self.__sent_msg = app.get_messages(self.__listener.message.chat.id, self.__listener.uid)
        self.__size = size
        self.__user_settings()

    def upload(self):
        path = f"{DOWNLOAD_DIR}{self.__listener.uid}"
        size = get_readable_file_size(get_path_size(path))
        for dirpath, subdir, files in sorted(walk(path)):
    def upload(self, o_files):
        for dirpath, subdir, files in sorted(walk(self.__path)):
            for file_ in sorted(files):
                if file_ in o_files:
                    continue
                if not file_.lower().endswith(tuple(EXTENSION_FILTER)):
                    self.__total_files += 1
                    up_path = ospath.join(dirpath, file_)
                    self.__total_files += 1
                    if ospath.getsize(up_path) == 0:
                        LOGGER.error(f"{up_path} size is zero, Telegram doesn't upload zero-size files")
                        self.__corrupted += 1

@@ -55,9 +57,12 @@ class TgUploader:
                    self.__msgs_dict[self.__sent_msg.link] = file_
                self._last_uploaded = 0
                sleep(1)
        if self.__listener.seed and not self.__listener.newDir:
            clean_unwanted(self.__path)
        if self.__total_files <= self.__corrupted:
            return self.__listener.onUploadError('Files Corrupted. Check logs')
        LOGGER.info(f"Leech Completed: {self.name}")
        size = get_readable_file_size(self.__size)
        self.__listener.onUploadComplete(None, size, self.__msgs_dict, self.__total_files, self.__corrupted, self.name)

    def __upload_file(self, up_path, file_, dirpath):

@@ -149,8 +154,12 @@ class TgUploader:
            self.__is_corrupted = True
        if self.__thumb is None and thumb is not None and ospath.lexists(thumb):
            osremove(thumb)
        if not self.__is_cancelled:
            osremove(up_path)
        if not self.__is_cancelled and \
           (not self.__listener.seed or self.__listener.newDir or dirpath.endswith("splited_files_mltb")):
            try:
                osremove(up_path)
            except:
                pass

    def __upload_progress(self, current, total):
        if self.__is_cancelled:
@@ -4,9 +4,25 @@ from bot import CMD_INDEX
class _BotCommands:
    def __init__(self):
        self.StartCommand = f'start{CMD_INDEX}'
        self.MirrorCommand = f'mirror{CMD_INDEX}'
        self.UnzipMirrorCommand = f'unzipmirror{CMD_INDEX}'
        self.ZipMirrorCommand = f'zipmirror{CMD_INDEX}'
        self.MirrorCommand = (f'mirror{CMD_INDEX}', f'm{CMD_INDEX}')
        self.UnzipMirrorCommand = (f'unzipmirror{CMD_INDEX}', f'uzm{CMD_INDEX}')
        self.ZipMirrorCommand = (f'zipmirror{CMD_INDEX}', f'zm{CMD_INDEX}')
        self.QbMirrorCommand = (f'qbmirror{CMD_INDEX}', f'qm{CMD_INDEX}')
        self.QbUnzipMirrorCommand = (f'qbunzipmirror{CMD_INDEX}', f'quzm{CMD_INDEX}')
        self.QbZipMirrorCommand = (f'qbzipmirror{CMD_INDEX}', f'qzm{CMD_INDEX}')
        self.YtdlCommand = (f'ytdl{CMD_INDEX}', f'y{CMD_INDEX}')
        self.YtdlZipCommand = (f'ytdlzip{CMD_INDEX}', f'yz{CMD_INDEX}')
        self.LeechCommand = (f'leech{CMD_INDEX}', f'l{CMD_INDEX}')
        self.UnzipLeechCommand = (f'unzipleech{CMD_INDEX}', f'uzl{CMD_INDEX}')
        self.ZipLeechCommand = (f'zipleech{CMD_INDEX}', f'zl{CMD_INDEX}')
        self.QbLeechCommand = (f'qbleech{CMD_INDEX}', f'ql{CMD_INDEX}')
        self.QbUnzipLeechCommand = (f'qbunzipleech{CMD_INDEX}', f'quzl{CMD_INDEX}')
        self.QbZipLeechCommand = (f'qbzipleech{CMD_INDEX}', f'qzl{CMD_INDEX}')
        self.YtdlLeechCommand = (f'ytdlleech{CMD_INDEX}', f'yl{CMD_INDEX}')
        self.YtdlZipLeechCommand = (f'ytdlzipleech{CMD_INDEX}', f'yzl{CMD_INDEX}')
        self.CloneCommand = f'clone{CMD_INDEX}'
        self.CountCommand = f'count{CMD_INDEX}'
        self.DeleteCommand = f'del{CMD_INDEX}'
        self.CancelMirror = f'cancel{CMD_INDEX}'
        self.CancelAllCommand = f'cancelall{CMD_INDEX}'
        self.ListCommand = f'list{CMD_INDEX}'

@@ -22,34 +38,17 @@ class _BotCommands:
        self.StatsCommand = f'stats{CMD_INDEX}'
        self.HelpCommand = f'help{CMD_INDEX}'
        self.LogCommand = f'log{CMD_INDEX}'
        self.CloneCommand = f'clone{CMD_INDEX}'
        self.CountCommand = f'count{CMD_INDEX}'
        self.WatchCommand = f'watch{CMD_INDEX}'
        self.ZipWatchCommand = f'zipwatch{CMD_INDEX}'
        self.QbMirrorCommand = f'qbmirror{CMD_INDEX}'
        self.QbUnzipMirrorCommand = f'qbunzipmirror{CMD_INDEX}'
        self.QbZipMirrorCommand = f'qbzipmirror{CMD_INDEX}'
        self.DeleteCommand = f'del{CMD_INDEX}'
        self.ShellCommand = f'shell{CMD_INDEX}'
        self.ExecHelpCommand = f'exechelp{CMD_INDEX}'
        self.LeechSetCommand = f'leechset{CMD_INDEX}'
        self.SetThumbCommand = f'setthumb{CMD_INDEX}'
        self.LeechCommand = f'leech{CMD_INDEX}'
        self.UnzipLeechCommand = f'unzipleech{CMD_INDEX}'
        self.ZipLeechCommand = f'zipleech{CMD_INDEX}'
        self.QbLeechCommand = f'qbleech{CMD_INDEX}'
        self.QbUnzipLeechCommand = f'qbunzipleech{CMD_INDEX}'
        self.QbZipLeechCommand = f'qbzipleech{CMD_INDEX}'
        self.LeechWatchCommand = f'leechwatch{CMD_INDEX}'
        self.LeechZipWatchCommand = f'leechzipwatch{CMD_INDEX}'
        self.BtSelectCommand = f'btsel{CMD_INDEX}'
        self.RssListCommand = f'rsslist{CMD_INDEX}'
        self.RssGetCommand = f'rssget{CMD_INDEX}'
        self.RssSubCommand = f'rsssub{CMD_INDEX}'
        self.RssUnSubCommand = f'rssunsub{CMD_INDEX}'
        self.RssSettingsCommand = f'rssset{CMD_INDEX}'
        self.EvalCommand = f'eval{CMD_INDEX}'
        self.ExecCommand = f'exec{CMD_INDEX}'
        self.ClearLocalsCommand = f'clearlocals{CMD_INDEX}'
        self.LeechSetCommand = f'leechset{CMD_INDEX}'
        self.SetThumbCommand = f'setthumb{CMD_INDEX}'
        self.BtSelectCommand = f'btsel{CMD_INDEX}'
        self.RssListCommand = (f'rsslist{CMD_INDEX}', f'rl{CMD_INDEX}')
        self.RssGetCommand = (f'rssget{CMD_INDEX}', f'rg{CMD_INDEX}')
        self.RssSubCommand = (f'rsssub{CMD_INDEX}', f'rs{CMD_INDEX}')
        self.RssUnSubCommand = (f'rssunsub{CMD_INDEX}', f'rus{CMD_INDEX}')
        self.RssSettingsCommand = (f'rssset{CMD_INDEX}', f'rst{CMD_INDEX}')

BotCommands = _BotCommands()
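Worth noting for the tuple-valued commands above: python-telegram-bot's CommandHandler accepts a list or tuple of command names, so one handler now serves a command and its new short alias. A minimal sketch (the callback name is illustrative, not from this commit):

    from telegram.ext import CommandHandler

    # registers both /mirror{CMD_INDEX} and /m{CMD_INDEX} on the same callback
    mirror_handler = CommandHandler(BotCommands.MirrorCommand, _mirror, run_async=True)
    dispatcher.add_handler(mirror_handler)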

@@ -85,6 +85,19 @@ def sendLogFile(bot, message: Message):
                        reply_to_message_id=message.message_id,
                        chat_id=message.chat_id)

def sendFile(bot, message: Message, name: str, caption=""):
    with open(name, 'rb') as f:
        try:
            bot.sendDocument(document=f, filename=f.name, reply_to_message_id=message.message_id,
                             caption=caption, parse_mode='HTML', chat_id=message.chat_id)
        except RetryAfter as r:
            LOGGER.warning(str(r))
            sleep(r.retry_after * 1.5)
            return sendFile(bot, message, name, caption)
        except Exception as e:
            LOGGER.error(str(e))
            return

def auto_delete_message(bot, cmd_message: Message, bot_message: Message):
    if AUTO_DELETE_MESSAGE_DURATION != -1:
        sleep(AUTO_DELETE_MESSAGE_DURATION)
@@ -112,22 +112,10 @@ def clear(update, context):
        del namespaces[update.message.chat_id]
    send("Cleared locals.", bot, update)

def exechelp(update, context):
    help_string = f'''
<b>Executor</b>
• {BotCommands.EvalCommand} <i>Run Python Code Line | Lines</i>
• {BotCommands.ExecCommand} <i>Run Commands In Exec</i>
• {BotCommands.ClearLocalsCommand} <i>Cleared locals</i>
'''
    sendMessage(help_string, context.bot, update.message)

EVAL_HANDLER = CommandHandler(BotCommands.EvalCommand, evaluate, filters=CustomFilters.owner_filter, run_async=True)
EXEC_HANDLER = CommandHandler(BotCommands.ExecCommand, execute, filters=CustomFilters.owner_filter, run_async=True)
CLEAR_HANDLER = CommandHandler(BotCommands.ClearLocalsCommand, clear, filters=CustomFilters.owner_filter, run_async=True)
EXECHELP_HANDLER = CommandHandler(BotCommands.ExecHelpCommand, exechelp, filters=CustomFilters.owner_filter, run_async=True)

dispatcher.add_handler(EVAL_HANDLER)
dispatcher.add_handler(EXEC_HANDLER)
dispatcher.add_handler(CLEAR_HANDLER)
dispatcher.add_handler(EXECHELP_HANDLER)
@@ -1,13 +1,16 @@
from threading import Thread
from telegram import InlineKeyboardMarkup
from telegram.ext import CommandHandler, CallbackQueryHandler
from time import time
from os import remove

from bot import LOGGER, dispatcher
from bot.helper.mirror_utils.upload_utils.gdriveTools import GoogleDriveHelper
from bot.helper.telegram_helper.message_utils import sendMessage, editMessage, sendMarkup
from bot.helper.telegram_helper.message_utils import sendMessage, editMessage, sendMarkup, sendFile, deleteMessage
from bot.helper.telegram_helper.filters import CustomFilters
from bot.helper.telegram_helper.bot_commands import BotCommands
from bot.helper.telegram_helper import button_build
from bot.helper.ext_utils.html_helper import hmtl_content

def list_buttons(update, context):
    user_id = update.message.from_user.id

@@ -36,14 +39,19 @@ def select_type(update, context):
    query.answer()
    item_type = data[2]
    editMessage(f"<b>Searching for <i>{key}</i></b>", msg)
    Thread(target=_list_drive, args=(key, msg, item_type)).start()
    Thread(target=_list_drive, args=(context.bot, key, msg, item_type)).start()

def _list_drive(key, bmsg, item_type):
def _list_drive(bot, key, bmsg, item_type):
    LOGGER.info(f"listing: {key}")
    gdrive = GoogleDriveHelper()
    msg, button = gdrive.drive_list(key, isRecursive=True, itemType=item_type)
    if button:
        editMessage(msg, bmsg, button)
    rmsg, msg = gdrive.drive_list(key, isRecursive=True, itemType=item_type)
    if msg:
        name = f'{key}_{time()}.html'
        with open(name, 'w', encoding='utf-8') as f:
            f.write(hmtl_content.replace('{fileName}', key).replace('{msg}', msg))
        deleteMessage(bot, bmsg)
        sendFile(bot, bmsg.reply_to_message, name, rmsg)
        remove(name)
    else:
        editMessage(f'No result found for <i>{key}</i>', bmsg)
bot/modules/listener.py (new file, 317 lines)
@@ -0,0 +1,317 @@
from requests import utils as rutils
from re import search as re_search
from time import sleep
from os import path as ospath, remove as osremove, listdir, walk
from subprocess import Popen
from html import escape
from telegram import InlineKeyboardMarkup

from bot import Interval, INDEX_URL, VIEW_LINK, aria2, DOWNLOAD_DIR, download_dict, download_dict_lock, \
    LEECH_SPLIT_SIZE, LOGGER, DB_URI, INCOMPLETE_TASK_NOTIFIER, MAX_SPLIT_SIZE
from bot.helper.ext_utils.fs_utils import get_base_name, get_path_size, split_file, clean_download, clean_target
from bot.helper.ext_utils.exceptions import NotSupportedExtractionArchive
from bot.helper.mirror_utils.status_utils.extract_status import ExtractStatus
from bot.helper.mirror_utils.status_utils.zip_status import ZipStatus
from bot.helper.mirror_utils.status_utils.split_status import SplitStatus
from bot.helper.mirror_utils.status_utils.upload_status import UploadStatus
from bot.helper.mirror_utils.status_utils.tg_upload_status import TgUploadStatus
from bot.helper.mirror_utils.upload_utils.gdriveTools import GoogleDriveHelper
from bot.helper.mirror_utils.upload_utils.pyrogramEngine import TgUploader
from bot.helper.telegram_helper.message_utils import sendMessage, sendMarkup, delete_all_messages, update_all_messages
from bot.helper.telegram_helper.button_build import ButtonMaker
from bot.helper.ext_utils.db_handler import DbManger


class MirrorLeechListener:
    def __init__(self, bot, message, isZip=False, extract=False, isQbit=False, isLeech=False, pswd=None, tag=None, select=False, seed=False):
        self.bot = bot
        self.message = message
        self.uid = self.message.message_id
        self.extract = extract
        self.isZip = isZip
        self.isQbit = isQbit
        self.isLeech = isLeech
        self.pswd = pswd
        self.tag = tag
        self.seed = seed
        self.newDir = ""
        self.dir = f"{DOWNLOAD_DIR}{self.uid}"
        self.select = select
        self.isPrivate = self.message.chat.type in ['private', 'group']
        self.suproc = None

    def clean(self):
        try:
            Interval[0].cancel()
            Interval.clear()
            aria2.purge()
            delete_all_messages()
        except:
            pass

    def onDownloadStart(self):
        if not self.isPrivate and INCOMPLETE_TASK_NOTIFIER and DB_URI is not None:
            DbManger().add_incomplete_task(self.message.chat.id, self.message.link, self.tag)

    def onDownloadComplete(self):
        with download_dict_lock:
            download = download_dict[self.uid]
            name = str(download.name()).replace('/', '')
            gid = download.gid()
        LOGGER.info(f"Download completed: {name}")
        if name == "None" or self.isQbit or not ospath.exists(f"{self.dir}/{name}"):
            name = listdir(f"{self.dir}")[-1]
        m_path = f'{self.dir}/{name}'
        size = get_path_size(m_path)
        if self.isZip:
            if self.seed and self.isLeech:
                self.newDir = f"{self.dir}10000"
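                # Editor note: zipping into a sibling "<dir>10000" working dir keeps the
                # original download untouched so the torrent can go on seeding.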
                path = f"{self.newDir}/{name}.zip"
            else:
                path = f"{m_path}.zip"
            with download_dict_lock:
                download_dict[self.uid] = ZipStatus(name, size, gid, self)
            if self.pswd is not None:
                if self.isLeech and int(size) > LEECH_SPLIT_SIZE:
                    LOGGER.info(f'Zip: orig_path: {m_path}, zip_path: {path}.0*')
                    self.suproc = Popen(["7z", f"-v{LEECH_SPLIT_SIZE}b", "a", "-mx=0", f"-p{self.pswd}", path, m_path])
                else:
                    LOGGER.info(f'Zip: orig_path: {m_path}, zip_path: {path}')
                    self.suproc = Popen(["7z", "a", "-mx=0", f"-p{self.pswd}", path, m_path])
            elif self.isLeech and int(size) > LEECH_SPLIT_SIZE:
                LOGGER.info(f'Zip: orig_path: {m_path}, zip_path: {path}.0*')
                self.suproc = Popen(["7z", f"-v{LEECH_SPLIT_SIZE}b", "a", "-mx=0", path, m_path])
            else:
                LOGGER.info(f'Zip: orig_path: {m_path}, zip_path: {path}')
                self.suproc = Popen(["7z", "a", "-mx=0", path, m_path])
            self.suproc.wait()
            if self.suproc.returncode == -9:
                return
            elif not self.seed:
                clean_target(m_path)
        elif self.extract:
            try:
                if ospath.isfile(m_path):
                    path = get_base_name(m_path)
                LOGGER.info(f"Extracting: {name}")
                with download_dict_lock:
                    download_dict[self.uid] = ExtractStatus(name, size, gid, self)
                if ospath.isdir(m_path):
                    if self.seed:
                        self.newDir = f"{self.dir}10000"
                        path = f"{self.newDir}/{name}"
                    for dirpath, subdir, files in walk(m_path, topdown=False):
                        for file_ in files:
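                            # Editor note: this regex targets only archive "first volumes"
                            # (.part1.rar, .7z.001, .zip.001) plus standalone .zip/.7z and
                            # .rar files that are not .partN.rar members.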
                            if re_search(r'\.part0*1\.rar$|\.7z\.0*1$|\.zip\.0*1$|\.zip$|\.7z$|^.(?!.*\.part\d+\.rar)(?=.*\.rar$)', file_):
                                m_path = ospath.join(dirpath, file_)
                                if self.seed:
                                    t_path = dirpath.replace(self.dir, self.newDir)
                                else:
                                    t_path = dirpath
                                if self.pswd is not None:
                                    self.suproc = Popen(["7z", "x", f"-p{self.pswd}", m_path, f"-o{t_path}", "-aot"])
                                else:
                                    self.suproc = Popen(["7z", "x", m_path, f"-o{t_path}", "-aot"])
                                self.suproc.wait()
                                if self.suproc.returncode == -9:
                                    return
                                elif self.suproc.returncode != 0:
                                    LOGGER.error('Unable to extract archive splits!')
                        if not self.seed and self.suproc is not None and self.suproc.returncode == 0:
                            for file_ in files:
                                if re_search(r'\.r\d+$|\.7z\.\d+$|\.z\d+$|\.zip\.\d+$|\.zip$|\.rar$|\.7z$', file_):
                                    del_path = ospath.join(dirpath, file_)
                                    try:
                                        osremove(del_path)
                                    except:
                                        return
                else:
                    if self.seed and self.isLeech:
                        self.newDir = f"{self.dir}10000"
                        path = path.replace(self.dir, self.newDir)
                    if self.pswd is not None:
                        self.suproc = Popen(["7z", "x", f"-p{self.pswd}", m_path, f"-o{path}", "-aot"])
                    else:
                        self.suproc = Popen(["7z", "x", m_path, f"-o{path}", "-aot"])
                    self.suproc.wait()
                    if self.suproc.returncode == -9:
                        return
                    elif self.suproc.returncode == 0:
                        LOGGER.info(f"Extracted Path: {path}")
                        if not self.seed:
                            try:
                                osremove(m_path)
                            except:
                                return
                    else:
                        LOGGER.error('Unable to extract archive! Uploading anyway')
                        self.newDir = ""
                        path = m_path
            except NotSupportedExtractionArchive:
                LOGGER.info("Not any valid archive, uploading file as it is.")
                self.newDir = ""
                path = m_path
        else:
            path = m_path
        up_dir, up_name = path.rsplit('/', 1)
        size = get_path_size(up_dir)
        if self.isLeech:
            m_size = []
            o_files = []
            if not self.isZip:
                checked = False
                for dirpath, subdir, files in walk(up_dir, topdown=False):
                    for file_ in files:
                        f_path = ospath.join(dirpath, file_)
                        f_size = ospath.getsize(f_path)
                        if f_size > LEECH_SPLIT_SIZE:
                            if not checked:
                                checked = True
                                with download_dict_lock:
                                    download_dict[self.uid] = SplitStatus(up_name, size, gid, self)
                                LOGGER.info(f"Splitting: {up_name}")
                            res = split_file(f_path, f_size, file_, dirpath, LEECH_SPLIT_SIZE, self)
                            if not res:
                                return
                            if res == "errored":
                                if f_size <= MAX_SPLIT_SIZE:
                                    continue
                                else:
                                    try:
                                        osremove(f_path)
                                    except:
                                        return
                            elif not self.seed or self.newDir:
                                try:
                                    osremove(f_path)
                                except:
                                    return
                            elif self.seed and res != "errored":
                                m_size.append(f_size)
                                o_files.append(file_)
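                        # Editor note: while seeding, oversized originals are kept for the
                        # torrent client; their sizes (m_size) and names (o_files) are
                        # remembered so the leech skips them and reports the right total.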

            size = get_path_size(up_dir)
            for s in m_size:
                size = size - s
            LOGGER.info(f"Leech Name: {up_name}")
            tg = TgUploader(up_name, up_dir, size, self)
            tg_upload_status = TgUploadStatus(tg, size, gid, self)
            with download_dict_lock:
                download_dict[self.uid] = tg_upload_status
            update_all_messages()
            tg.upload(o_files)
        else:
            up_path = f'{up_dir}/{up_name}'
            size = get_path_size(up_path)
            LOGGER.info(f"Upload Name: {up_name}")
            drive = GoogleDriveHelper(up_name, up_dir, size, self)
            upload_status = UploadStatus(drive, size, gid, self)
            with download_dict_lock:
                download_dict[self.uid] = upload_status
            update_all_messages()
            drive.upload(up_name)

    def onUploadComplete(self, link: str, size, files, folders, typ, name):
        if not self.isPrivate and INCOMPLETE_TASK_NOTIFIER and DB_URI is not None:
            DbManger().rm_complete_task(self.message.link)
        msg = f"<b>Name: </b><code>{escape(name)}</code>\n\n<b>Size: </b>{size}"
        if self.isLeech:
            msg += f'\n<b>Total Files: </b>{folders}'
            if typ != 0:
                msg += f'\n<b>Corrupted Files: </b>{typ}'
            msg += f'\n<b>cc: </b>{self.tag}\n\n'
            if not files:
                sendMessage(msg, self.bot, self.message)
            else:
                fmsg = ''
                for index, (link, name) in enumerate(files.items(), start=1):
                    fmsg += f"{index}. <a href='{link}'>{name}</a>\n"
                    if len(fmsg.encode() + msg.encode()) > 4000:
                        sendMessage(msg + fmsg, self.bot, self.message)
                        sleep(1)
                        fmsg = ''
                if fmsg != '':
                    sendMessage(msg + fmsg, self.bot, self.message)
            if self.seed:
                if self.newDir:
                    clean_target(self.newDir)
                return
        else:
            msg += f'\n\n<b>Type: </b>{typ}'
            if typ == "Folder":
                msg += f'\n<b>SubFolders: </b>{folders}'
            msg += f'\n<b>Files: </b>{files}'
            msg += f'\n\n<b>cc: </b>{self.tag}'
            buttons = ButtonMaker()
            buttons.buildbutton("☁️ Drive Link", link)
            LOGGER.info(f'Done Uploading {name}')
            if INDEX_URL is not None:
                url_path = rutils.quote(f'{name}')
                share_url = f'{INDEX_URL}/{url_path}'
                if typ == "Folder":
                    share_url += '/'
                    buttons.buildbutton("⚡ Index Link", share_url)
                else:
                    buttons.buildbutton("⚡ Index Link", share_url)
                    if VIEW_LINK:
                        share_urls = f'{INDEX_URL}/{url_path}?a=view'
                        buttons.buildbutton("🌐 View Link", share_urls)
            sendMarkup(msg, self.bot, self.message, InlineKeyboardMarkup(buttons.build_menu(2)))
            if self.seed:
                if self.isZip:
                    clean_target(f"{self.dir}/{name}")
                elif self.newDir:
                    clean_target(self.newDir)
                return
        clean_download(self.dir)
        with download_dict_lock:
            try:
                del download_dict[self.uid]
            except Exception as e:
                LOGGER.error(str(e))
            count = len(download_dict)
        if count == 0:
            self.clean()
        else:
            update_all_messages()

    def onDownloadError(self, error):
        error = error.replace('<', ' ').replace('>', ' ')
        clean_download(self.dir)
        if self.newDir:
            clean_download(self.newDir)
        with download_dict_lock:
            try:
                del download_dict[self.uid]
            except Exception as e:
                LOGGER.error(str(e))
            count = len(download_dict)
        msg = f"{self.tag} your download has been stopped due to: {error}"
        sendMessage(msg, self.bot, self.message)
        if count == 0:
            self.clean()
        else:
            update_all_messages()

        if not self.isPrivate and INCOMPLETE_TASK_NOTIFIER and DB_URI is not None:
            DbManger().rm_complete_task(self.message.link)

    def onUploadError(self, error):
        e_str = error.replace('<', '').replace('>', '')
        clean_download(self.dir)
        if self.newDir:
            clean_download(self.newDir)
        with download_dict_lock:
            try:
                del download_dict[self.uid]
            except Exception as e:
                LOGGER.error(str(e))
            count = len(download_dict)
        sendMessage(f"{self.tag} {e_str}", self.bot, self.message)
        if count == 0:
            self.clean()
        else:
            update_all_messages()

        if not self.isPrivate and INCOMPLETE_TASK_NOTIFIER and DB_URI is not None:
            DbManger().rm_complete_task(self.message.link)

@@ -1,546 +0,0 @@
from base64 import b64encode
|
||||
from requests import utils as rutils, get as rget
|
||||
from re import match as re_match, search as re_search, split as re_split
|
||||
from time import sleep, time
|
||||
from os import path as ospath, remove as osremove, listdir, walk
|
||||
from shutil import rmtree
|
||||
from threading import Thread
|
||||
from subprocess import Popen
|
||||
from html import escape
|
||||
from telegram.ext import CommandHandler
|
||||
from telegram import InlineKeyboardMarkup
|
||||
|
||||
from bot import Interval, INDEX_URL, VIEW_LINK, aria2, QB_SEED, dispatcher, DOWNLOAD_DIR, \
|
||||
download_dict, download_dict_lock, LEECH_SPLIT_SIZE, LOGGER, DB_URI, INCOMPLETE_TASK_NOTIFIER
|
||||
from bot.helper.ext_utils.bot_utils import is_url, is_magnet, is_mega_link, is_gdrive_link, get_content_type
|
||||
from bot.helper.ext_utils.fs_utils import get_base_name, get_path_size, split_file, clean_download
|
||||
from bot.helper.ext_utils.exceptions import DirectDownloadLinkException, NotSupportedExtractionArchive
|
||||
from bot.helper.mirror_utils.download_utils.aria2_download import add_aria2c_download
|
||||
from bot.helper.mirror_utils.download_utils.gd_downloader import add_gd_download
|
||||
from bot.helper.mirror_utils.download_utils.qbit_downloader import QbDownloader
|
||||
from bot.helper.mirror_utils.download_utils.mega_downloader import add_mega_download
|
||||
from bot.helper.mirror_utils.download_utils.direct_link_generator import direct_link_generator
|
||||
from bot.helper.mirror_utils.download_utils.telegram_downloader import TelegramDownloadHelper
|
||||
from bot.helper.mirror_utils.status_utils.extract_status import ExtractStatus
|
||||
from bot.helper.mirror_utils.status_utils.zip_status import ZipStatus
|
||||
from bot.helper.mirror_utils.status_utils.split_status import SplitStatus
|
||||
from bot.helper.mirror_utils.status_utils.upload_status import UploadStatus
|
||||
from bot.helper.mirror_utils.status_utils.tg_upload_status import TgUploadStatus
|
||||
from bot.helper.mirror_utils.upload_utils.gdriveTools import GoogleDriveHelper
|
||||
from bot.helper.mirror_utils.upload_utils.pyrogramEngine import TgUploader
|
||||
from bot.helper.telegram_helper.bot_commands import BotCommands
|
||||
from bot.helper.telegram_helper.filters import CustomFilters
|
||||
from bot.helper.telegram_helper.message_utils import sendMessage, sendMarkup, delete_all_messages, update_all_messages
|
||||
from bot.helper.telegram_helper.button_build import ButtonMaker
|
||||
from bot.helper.ext_utils.db_handler import DbManger
|
||||
|
||||
|
||||
class MirrorListener:
    def __init__(self, bot, message, isZip=False, extract=False, isQbit=False, isLeech=False, pswd=None, tag=None, select=False, seed=False):
        self.bot = bot
        self.message = message
        self.uid = self.message.message_id
        self.extract = extract
        self.isZip = isZip
        self.isQbit = isQbit
        self.isLeech = isLeech
        self.pswd = pswd
        self.tag = tag
        self.seed = any([seed, QB_SEED])
        self.select = select
        self.isPrivate = self.message.chat.type in ['private', 'group']
        self.suproc = None

    def clean(self):
        try:
            Interval[0].cancel()
            Interval.clear()
            aria2.purge()
            delete_all_messages()
        except:
            pass

    def onDownloadStart(self):
        if not self.isPrivate and INCOMPLETE_TASK_NOTIFIER and DB_URI is not None:
            DbManger().add_incomplete_task(self.message.chat.id, self.message.link, self.tag)

    def onDownloadComplete(self):
        with download_dict_lock:
            LOGGER.info(f"Download completed: {download_dict[self.uid].name()}")
            download = download_dict[self.uid]
            name = str(download.name()).replace('/', '')
            gid = download.gid()
        if name == "None" or self.isQbit or not ospath.exists(f'{DOWNLOAD_DIR}{self.uid}/{name}'):
            name = listdir(f'{DOWNLOAD_DIR}{self.uid}')[-1]
        m_path = f'{DOWNLOAD_DIR}{self.uid}/{name}'
        size = get_path_size(m_path)
        if self.isZip:
            path = m_path + ".zip"
            with download_dict_lock:
                download_dict[self.uid] = ZipStatus(name, size, gid, self)
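            # 7z flags: "a" adds to an archive, -mx=0 stores without compression,
            # -v{size}b splits the output into volumes and -p sets a password.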
            if self.pswd is not None:
                if self.isLeech and int(size) > LEECH_SPLIT_SIZE:
                    LOGGER.info(f'Zip: orig_path: {m_path}, zip_path: {path}.0*')
                    self.suproc = Popen(["7z", f"-v{LEECH_SPLIT_SIZE}b", "a", "-mx=0", f"-p{self.pswd}", path, m_path])
                else:
                    LOGGER.info(f'Zip: orig_path: {m_path}, zip_path: {path}')
                    self.suproc = Popen(["7z", "a", "-mx=0", f"-p{self.pswd}", path, m_path])
            elif self.isLeech and int(size) > LEECH_SPLIT_SIZE:
                LOGGER.info(f'Zip: orig_path: {m_path}, zip_path: {path}.0*')
                self.suproc = Popen(["7z", f"-v{LEECH_SPLIT_SIZE}b", "a", "-mx=0", path, m_path])
            else:
                LOGGER.info(f'Zip: orig_path: {m_path}, zip_path: {path}')
                self.suproc = Popen(["7z", "a", "-mx=0", path, m_path])
            self.suproc.wait()
            if self.suproc.returncode == -9:
                return
            elif self.suproc.returncode != 0:
                LOGGER.error('An error occurred while zipping! Uploading anyway')
                path = f'{DOWNLOAD_DIR}{self.uid}/{name}'
            if self.suproc.returncode == 0 and (not self.isQbit or not self.seed or self.isLeech):
                try:
                    rmtree(m_path)
                except:
                    osremove(m_path)
        elif self.extract:
            try:
                if ospath.isfile(m_path):
                    path = get_base_name(m_path)
                LOGGER.info(f"Extracting: {name}")
                with download_dict_lock:
                    download_dict[self.uid] = ExtractStatus(name, size, gid, self)
                if ospath.isdir(m_path):
                    for dirpath, subdir, files in walk(m_path, topdown=False):
                        for file_ in files:
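                            # Only extract plain archives or the first volume of a split set
                            # (.part1.rar, .7z.001, .zip.001); 7z locates the remaining volumes itself.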
                            if file_.endswith((".zip", ".7z")) or re_search(r'\.part0*1\.rar$|\.7z\.0*1$|\.zip\.0*1$', file_) \
                                    or (file_.endswith(".rar") and not re_search(r'\.part\d+\.rar$', file_)):
                                m_path = ospath.join(dirpath, file_)
                                if self.pswd is not None:
                                    self.suproc = Popen(["7z", "x", f"-p{self.pswd}", m_path, f"-o{dirpath}", "-aot"])
                                else:
                                    self.suproc = Popen(["7z", "x", m_path, f"-o{dirpath}", "-aot"])
                                self.suproc.wait()
                                if self.suproc.returncode == -9:
                                    return
                                elif self.suproc.returncode != 0:
                                    LOGGER.error('Unable to extract archive splits! Uploading anyway')
                        if self.suproc is not None and self.suproc.returncode == 0:
                            for file_ in files:
                                if file_.endswith((".rar", ".zip", ".7z")) or \
                                        re_search(r'\.r\d+$|\.7z\.\d+$|\.z\d+$|\.zip\.\d+$', file_):
                                    del_path = ospath.join(dirpath, file_)
                                    osremove(del_path)
                    path = f'{DOWNLOAD_DIR}{self.uid}/{name}'
                else:
                    if self.pswd is not None:
                        self.suproc = Popen(["bash", "pextract", m_path, self.pswd])
                    else:
                        self.suproc = Popen(["bash", "extract", m_path])
                    self.suproc.wait()
                    if self.suproc.returncode == -9:
                        return
                    elif self.suproc.returncode == 0:
                        LOGGER.info(f"Extracted Path: {path}")
                        osremove(m_path)
                    else:
                        LOGGER.error('Unable to extract archive! Uploading anyway')
                        path = f'{DOWNLOAD_DIR}{self.uid}/{name}'
            except NotSupportedExtractionArchive:
                LOGGER.info("Not any valid archive, uploading file as it is.")
                path = f'{DOWNLOAD_DIR}{self.uid}/{name}'
        else:
            path = f'{DOWNLOAD_DIR}{self.uid}/{name}'
        up_name = path.rsplit('/', 1)[-1]
        if self.isLeech and not self.isZip:
            checked = False
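            # Telegram limits upload size, so split anything bigger than LEECH_SPLIT_SIZE first.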
            for dirpath, subdir, files in walk(f'{DOWNLOAD_DIR}{self.uid}', topdown=False):
                for file_ in files:
                    f_path = ospath.join(dirpath, file_)
                    f_size = ospath.getsize(f_path)
                    if int(f_size) > LEECH_SPLIT_SIZE:
                        if not checked:
                            checked = True
                            with download_dict_lock:
                                download_dict[self.uid] = SplitStatus(up_name, size, gid, self)
                            LOGGER.info(f"Splitting: {up_name}")
                        res = split_file(f_path, f_size, file_, dirpath, LEECH_SPLIT_SIZE, self)
                        if not res:
                            return
                        osremove(f_path)
        if self.isLeech:
            size = get_path_size(f'{DOWNLOAD_DIR}{self.uid}')
            LOGGER.info(f"Leech Name: {up_name}")
            tg = TgUploader(up_name, self)
            tg_upload_status = TgUploadStatus(tg, size, gid, self)
            with download_dict_lock:
                download_dict[self.uid] = tg_upload_status
            update_all_messages()
            tg.upload()
        else:
            up_path = f'{DOWNLOAD_DIR}{self.uid}/{up_name}'
            size = get_path_size(up_path)
            LOGGER.info(f"Upload Name: {up_name}")
            drive = GoogleDriveHelper(up_name, self)
            upload_status = UploadStatus(drive, size, gid, self)
            with download_dict_lock:
                download_dict[self.uid] = upload_status
            update_all_messages()
            drive.upload(up_name)

    def onDownloadError(self, error):
        error = error.replace('<', ' ').replace('>', ' ')
        clean_download(f'{DOWNLOAD_DIR}{self.uid}')
        with download_dict_lock:
            try:
                del download_dict[self.uid]
            except Exception as e:
                LOGGER.error(str(e))
            count = len(download_dict)
        msg = f"{self.tag} your download has been stopped due to: {error}"
        sendMessage(msg, self.bot, self.message)
        if count == 0:
            self.clean()
        else:
            update_all_messages()

        if not self.isPrivate and INCOMPLETE_TASK_NOTIFIER and DB_URI is not None:
            DbManger().rm_complete_task(self.message.link)

    def onUploadComplete(self, link: str, size, files, folders, typ, name: str):
        if not self.isPrivate and INCOMPLETE_TASK_NOTIFIER and DB_URI is not None:
            DbManger().rm_complete_task(self.message.link)
        msg = f"<b>Name: </b><code>{escape(name)}</code>\n\n<b>Size: </b>{size}"
        if self.isLeech:
            msg += f'\n<b>Total Files: </b>{folders}'
            if typ != 0:
                msg += f'\n<b>Corrupted Files: </b>{typ}'
            msg += f'\n<b>cc: </b>{self.tag}\n\n'
            if not files:
                sendMessage(msg, self.bot, self.message)
            else:
                fmsg = ''
                for index, (link, name) in enumerate(files.items(), start=1):
                    fmsg += f"{index}. <a href='{link}'>{name}</a>\n"
                    if len(fmsg.encode() + msg.encode()) > 4000:
                        sendMessage(msg + fmsg, self.bot, self.message)
                        sleep(1)
                        fmsg = ''
                if fmsg != '':
                    sendMessage(msg + fmsg, self.bot, self.message)
        else:
            msg += f'\n\n<b>Type: </b>{typ}'
            if ospath.isdir(f'{DOWNLOAD_DIR}{self.uid}/{name}'):
                msg += f'\n<b>SubFolders: </b>{folders}'
            msg += f'\n<b>Files: </b>{files}'
            msg += f'\n\n<b>cc: </b>{self.tag}'
            buttons = ButtonMaker()
            buttons.buildbutton("☁️ Drive Link", link)
            LOGGER.info(f'Done Uploading {name}')
            if INDEX_URL is not None:
                url_path = rutils.quote(f'{name}')
                share_url = f'{INDEX_URL}/{url_path}'
                if ospath.isdir(f'{DOWNLOAD_DIR}/{self.uid}/{name}'):
                    share_url += '/'
                    buttons.buildbutton("⚡ Index Link", share_url)
                else:
                    buttons.buildbutton("⚡ Index Link", share_url)
                    if VIEW_LINK:
                        share_urls = f'{INDEX_URL}/{url_path}?a=view'
                        buttons.buildbutton("🌐 View Link", share_urls)
            sendMarkup(msg, self.bot, self.message, InlineKeyboardMarkup(buttons.build_menu(2)))
            if self.isQbit and self.seed and not self.extract:
                if self.isZip:
                    try:
                        osremove(f'{DOWNLOAD_DIR}{self.uid}/{name}')
                    except:
                        pass
                return
        clean_download(f'{DOWNLOAD_DIR}{self.uid}')
        with download_dict_lock:
            try:
                del download_dict[self.uid]
            except Exception as e:
                LOGGER.error(str(e))
            count = len(download_dict)
        if count == 0:
            self.clean()
        else:
            update_all_messages()

    def onUploadError(self, error):
        e_str = error.replace('<', '').replace('>', '')
        clean_download(f'{DOWNLOAD_DIR}{self.uid}')
        with download_dict_lock:
            try:
                del download_dict[self.uid]
            except Exception as e:
                LOGGER.error(str(e))
            count = len(download_dict)
        sendMessage(f"{self.tag} {e_str}", self.bot, self.message)
        if count == 0:
            self.clean()
        else:
            update_all_messages()

        if not self.isPrivate and INCOMPLETE_TASK_NOTIFIER and DB_URI is not None:
            DbManger().rm_complete_task(self.message.link)

def _mirror(bot, message, isZip=False, extract=False, isQbit=False, isLeech=False, pswd=None, multi=0, select=False, seed=False):
    mesg = message.text.split('\n')
    message_args = mesg[0].split(maxsplit=1)
    name_args = mesg[0].split('|', maxsplit=1)
    index = 1

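    # Command prefixes: "s" enables torrent file selection, "d" enables seeding after download.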
    if len(message_args) > 1:
        args = mesg[0].split(maxsplit=3)
        if "s" in [x.strip() for x in args]:
            select = True
            index += 1
        if "d" in [x.strip() for x in args]:
            seed = True
            index += 1
        message_args = mesg[0].split(maxsplit=index)
        if len(message_args) > index:
            link = message_args[index].strip()
            if link.isdigit():
                multi = int(link)
                link = ''
            elif link.startswith(("|", "pswd:")):
                link = ''
        else:
            link = ''
    else:
        link = ''

    if len(name_args) > 1:
        name = name_args[1]
        name = name.split(' pswd:')[0]
        name = name.strip()
    else:
        name = ''

    link = re_split(r"pswd:|\|", link)[0]
    link = link.strip()

    pswd_arg = mesg[0].split(' pswd: ')
    if len(pswd_arg) > 1:
        pswd = pswd_arg[1]

    if message.from_user.username:
        tag = f"@{message.from_user.username}"
    else:
        tag = message.from_user.mention_html(message.from_user.first_name)

    reply_to = message.reply_to_message
    if reply_to is not None:
        file = None
        media_array = [reply_to.document, reply_to.video, reply_to.audio]
        for i in media_array:
            if i is not None:
                file = i
                break

        if not reply_to.from_user.is_bot:
            if reply_to.from_user.username:
                tag = f"@{reply_to.from_user.username}"
            else:
                tag = reply_to.from_user.mention_html(reply_to.from_user.first_name)

        if not is_url(link) and not is_magnet(link) or len(link) == 0:
            if file is None:
                reply_text = reply_to.text.split(maxsplit=1)[0].strip()
                if is_url(reply_text) or is_magnet(reply_text):
                    link = reply_text
            elif file.mime_type != "application/x-bittorrent" and not isQbit:
                listener = MirrorListener(bot, message, isZip, extract, isQbit, isLeech, pswd, tag)
                Thread(target=TelegramDownloadHelper(listener).add_download, args=(message, f'{DOWNLOAD_DIR}{listener.uid}/', name)).start()
                if multi > 1:
                    sleep(4)
                    nextmsg = type('nextmsg', (object, ), {'chat_id': message.chat_id, 'message_id': message.reply_to_message.message_id + 1})
                    nextmsg = sendMessage(message_args[0], bot, nextmsg)
                    nextmsg.from_user.id = message.from_user.id
                    multi -= 1
                    sleep(4)
                    Thread(target=_mirror, args=(bot, nextmsg, isZip, extract, isQbit, isLeech, pswd, multi)).start()
                return
            else:
                link = file.get_file().file_path

    if not is_url(link) and not is_magnet(link) and not ospath.exists(link):
        help_msg = "<b>Send link along with command line:</b>"
        if isQbit:
            help_msg += "\n\n<b>Bittorrent selection:</b>"
            help_msg += "\n<code>/cmd</code> <b>s</b> {link} or by replying to {file/link}"
            help_msg += "\n\n<b>Qbittorrent seed</b>:"
            help_msg += "\n<code>/qbcmd</code> <b>d</b> {link} or by replying to {file/link}. "
            help_msg += "You can use the seed and select prefixes together with qb cmds."
            help_msg += "\n\n<b>Multi links only by replying to first link/file:</b>"
            help_msg += "\n<code>/command</code> 10(number of links/files)"
        else:
            help_msg += "\n<code>/cmd</code> {link} |newname pswd: xx [zip/unzip]"
            help_msg += "\n\n<b>By replying to link/file:</b>"
            help_msg += "\n<code>/cmd</code> |newname pswd: xx [zip/unzip]"
            help_msg += "\n\n<b>Direct link authorization:</b>"
            help_msg += "\n<code>/cmd</code> {link} |newname pswd: xx\nusername\npassword"
            help_msg += "\n\n<b>Bittorrent selection:</b>"
            help_msg += "\n<code>/cmd</code> <b>s</b> {link} or by replying to {file/link}"
            help_msg += "\n\n<b>Multi links only by replying to first link/file:</b>"
            help_msg += "\n<code>/command</code> 10(number of links/files)"
        return sendMessage(help_msg, bot, message)

    LOGGER.info(link)

    if not is_mega_link(link) and not isQbit and not is_magnet(link) \
            and not is_gdrive_link(link) and not link.endswith('.torrent'):
        content_type = get_content_type(link)
        if content_type is None or re_match(r'text/html|text/plain', content_type):
            try:
                link = direct_link_generator(link)
                LOGGER.info(f"Generated link: {link}")
            except DirectDownloadLinkException as e:
                LOGGER.info(str(e))
                if str(e).startswith('ERROR:'):
                    return sendMessage(str(e), bot, message)
    elif isQbit and not is_magnet(link):
        if link.endswith('.torrent') or "https://api.telegram.org/file/" in link:
            content_type = None
        else:
            content_type = get_content_type(link)
        if content_type is None or re_match(r'application/x-bittorrent|application/octet-stream', content_type):
            try:
                resp = rget(link, timeout=10, headers={'user-agent': 'Wget/1.12'})
                if resp.status_code == 200:
                    file_name = str(time()).replace(".", "") + ".torrent"
                    with open(file_name, "wb") as t:
                        t.write(resp.content)
                    link = str(file_name)
                else:
                    return sendMessage(f"{tag} ERROR: link got HTTP response: {resp.status_code}", bot, message)
            except Exception as e:
                error = str(e).replace('<', ' ').replace('>', ' ')
                if error.startswith('No connection adapters were found for'):
                    link = error.split("'")[1]
                else:
                    LOGGER.error(str(e))
                    return sendMessage(tag + " " + error, bot, message)
        else:
            msg = "Qb commands are for torrents only. If you are trying to download a torrent then report it."
            return sendMessage(msg, bot, message)

    listener = MirrorListener(bot, message, isZip, extract, isQbit, isLeech, pswd, tag, select, seed)

    if is_gdrive_link(link):
        if not isZip and not extract and not isLeech:
            gmsg = f"Use /{BotCommands.CloneCommand} to clone Google Drive file/folder\n\n"
            gmsg += f"Use /{BotCommands.ZipMirrorCommand} to make a zip of a Google Drive folder\n\n"
            gmsg += f"Use /{BotCommands.UnzipMirrorCommand} to extract a Google Drive archive folder/file"
            sendMessage(gmsg, bot, message)
        else:
            Thread(target=add_gd_download, args=(link, listener, name)).start()
    elif is_mega_link(link):
        Thread(target=add_mega_download, args=(link, f'{DOWNLOAD_DIR}{listener.uid}/', listener, name)).start()
    elif isQbit and (is_magnet(link) or ospath.exists(link)):
        Thread(target=QbDownloader(listener).add_qb_torrent, args=(link, f'{DOWNLOAD_DIR}{listener.uid}', select)).start()
    else:
        if len(mesg) > 1:
            try:
                ussr = mesg[1]
            except:
                ussr = ''
            try:
                pssw = mesg[2]
            except:
                pssw = ''
            auth = f"{ussr}:{pssw}"
            auth = "Basic " + b64encode(auth.encode()).decode('ascii')
        else:
            auth = ''
        Thread(target=add_aria2c_download, args=(link, f'{DOWNLOAD_DIR}{listener.uid}', listener, name, auth, select)).start()

    if multi > 1:
        sleep(4)
        nextmsg = type('nextmsg', (object, ), {'chat_id': message.chat_id, 'message_id': message.reply_to_message.message_id + 1})
        msg = message_args[0]
        if len(mesg) > 2:
            msg += '\n' + mesg[1] + '\n' + mesg[2]
        nextmsg = sendMessage(msg, bot, nextmsg)
        nextmsg.from_user.id = message.from_user.id
        multi -= 1
        sleep(4)
        Thread(target=_mirror, args=(bot, nextmsg, isZip, extract, isQbit, isLeech, pswd, multi, select, seed)).start()


def mirror(update, context):
    _mirror(context.bot, update.message)

def unzip_mirror(update, context):
    _mirror(context.bot, update.message, extract=True)

def zip_mirror(update, context):
    _mirror(context.bot, update.message, True)

def qb_mirror(update, context):
    _mirror(context.bot, update.message, isQbit=True)

def qb_unzip_mirror(update, context):
    _mirror(context.bot, update.message, extract=True, isQbit=True)

def qb_zip_mirror(update, context):
    _mirror(context.bot, update.message, True, isQbit=True)

def leech(update, context):
    _mirror(context.bot, update.message, isLeech=True)

def unzip_leech(update, context):
    _mirror(context.bot, update.message, extract=True, isLeech=True)

def zip_leech(update, context):
    _mirror(context.bot, update.message, True, isLeech=True)

def qb_leech(update, context):
    _mirror(context.bot, update.message, isQbit=True, isLeech=True)

def qb_unzip_leech(update, context):
    _mirror(context.bot, update.message, extract=True, isQbit=True, isLeech=True)

def qb_zip_leech(update, context):
    _mirror(context.bot, update.message, True, isQbit=True, isLeech=True)

mirror_handler = CommandHandler(BotCommands.MirrorCommand, mirror,
                                filters=CustomFilters.authorized_chat | CustomFilters.authorized_user, run_async=True)
unzip_mirror_handler = CommandHandler(BotCommands.UnzipMirrorCommand, unzip_mirror,
                                      filters=CustomFilters.authorized_chat | CustomFilters.authorized_user, run_async=True)
zip_mirror_handler = CommandHandler(BotCommands.ZipMirrorCommand, zip_mirror,
                                    filters=CustomFilters.authorized_chat | CustomFilters.authorized_user, run_async=True)
qb_mirror_handler = CommandHandler(BotCommands.QbMirrorCommand, qb_mirror,
                                   filters=CustomFilters.authorized_chat | CustomFilters.authorized_user, run_async=True)
qb_unzip_mirror_handler = CommandHandler(BotCommands.QbUnzipMirrorCommand, qb_unzip_mirror,
                                         filters=CustomFilters.authorized_chat | CustomFilters.authorized_user, run_async=True)
qb_zip_mirror_handler = CommandHandler(BotCommands.QbZipMirrorCommand, qb_zip_mirror,
                                       filters=CustomFilters.authorized_chat | CustomFilters.authorized_user, run_async=True)
leech_handler = CommandHandler(BotCommands.LeechCommand, leech,
                               filters=CustomFilters.authorized_chat | CustomFilters.authorized_user, run_async=True)
unzip_leech_handler = CommandHandler(BotCommands.UnzipLeechCommand, unzip_leech,
                                     filters=CustomFilters.authorized_chat | CustomFilters.authorized_user, run_async=True)
zip_leech_handler = CommandHandler(BotCommands.ZipLeechCommand, zip_leech,
                                   filters=CustomFilters.authorized_chat | CustomFilters.authorized_user, run_async=True)
qb_leech_handler = CommandHandler(BotCommands.QbLeechCommand, qb_leech,
                                  filters=CustomFilters.authorized_chat | CustomFilters.authorized_user, run_async=True)
qb_unzip_leech_handler = CommandHandler(BotCommands.QbUnzipLeechCommand, qb_unzip_leech,
                                        filters=CustomFilters.authorized_chat | CustomFilters.authorized_user, run_async=True)
qb_zip_leech_handler = CommandHandler(BotCommands.QbZipLeechCommand, qb_zip_leech,
                                      filters=CustomFilters.authorized_chat | CustomFilters.authorized_user, run_async=True)

dispatcher.add_handler(mirror_handler)
dispatcher.add_handler(unzip_mirror_handler)
dispatcher.add_handler(zip_mirror_handler)
dispatcher.add_handler(qb_mirror_handler)
dispatcher.add_handler(qb_unzip_mirror_handler)
dispatcher.add_handler(qb_zip_mirror_handler)
dispatcher.add_handler(leech_handler)
dispatcher.add_handler(unzip_leech_handler)
dispatcher.add_handler(zip_leech_handler)
dispatcher.add_handler(qb_leech_handler)
dispatcher.add_handler(qb_unzip_leech_handler)
dispatcher.add_handler(qb_zip_leech_handler)

bot/modules/mirror_leech.py (new file, 292 lines)
@@ -0,0 +1,292 @@
from base64 import b64encode
from requests import get as rget
from re import match as re_match, split as re_split
from time import sleep, time
from os import path as ospath
from threading import Thread
from telegram.ext import CommandHandler

from bot import dispatcher, DOWNLOAD_DIR, LOGGER
from bot.helper.ext_utils.bot_utils import is_url, is_magnet, is_mega_link, is_gdrive_link, get_content_type
from bot.helper.ext_utils.exceptions import DirectDownloadLinkException
from bot.helper.mirror_utils.download_utils.aria2_download import add_aria2c_download
from bot.helper.mirror_utils.download_utils.gd_downloader import add_gd_download
from bot.helper.mirror_utils.download_utils.qbit_downloader import QbDownloader
from bot.helper.mirror_utils.download_utils.mega_downloader import add_mega_download
from bot.helper.mirror_utils.download_utils.direct_link_generator import direct_link_generator
from bot.helper.mirror_utils.download_utils.telegram_downloader import TelegramDownloadHelper
from bot.helper.telegram_helper.bot_commands import BotCommands
from bot.helper.telegram_helper.filters import CustomFilters
from bot.helper.telegram_helper.message_utils import sendMessage
from .listener import MirrorLeechListener


def _mirror_leech(bot, message, isZip=False, extract=False, isQbit=False, isLeech=False, multi=0):
    mesg = message.text.split('\n')
    message_args = mesg[0].split(maxsplit=1)
    name_args = mesg[0].split('|', maxsplit=1)
    index = 1
    ratio = None
    seed_time = None
    select = False
    seed = False

    if len(message_args) > 1:
        args = mesg[0].split(maxsplit=3)
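        # Flags: "s" selects files from the torrent before download, "d" seeds after download;
        # "d:<ratio>:<time>" also sets a seed ratio and/or seed time in minutes.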
        for x in args:
            x = x.strip()
            if x == 's':
                select = True
                index += 1
            elif x == 'd':
                seed = True
                index += 1
            elif x.startswith('d:'):
                seed = True
                index += 1
                dargs = x.split(':')
                ratio = dargs[1] if dargs[1] else None
                if len(dargs) == 3:
                    seed_time = dargs[2] if dargs[2] else None
        message_args = mesg[0].split(maxsplit=index)
        if len(message_args) > index:
            link = message_args[index].strip()
            if link.isdigit():
                if multi == 0:
                    multi = int(link)
                link = ''
            elif link.startswith(("|", "pswd:")):
                link = ''
        else:
            link = ''
    else:
        link = ''

    if len(name_args) > 1:
        name = name_args[1]
        name = name.split(' pswd:')[0]
        name = name.strip()
    else:
        name = ''

    link = re_split(r"pswd:|\|", link)[0]
    link = link.strip()

    pswd_arg = mesg[0].split(' pswd: ')
    if len(pswd_arg) > 1:
        pswd = pswd_arg[1]
    else:
        pswd = None

    if message.from_user.username:
        tag = f"@{message.from_user.username}"
    else:
        tag = message.from_user.mention_html(message.from_user.first_name)

    reply_to = message.reply_to_message
    if reply_to is not None:
        file_ = next((i for i in [reply_to.document, reply_to.video, reply_to.audio, reply_to.photo] if i), None)

        if not reply_to.from_user.is_bot:
            if reply_to.from_user.username:
                tag = f"@{reply_to.from_user.username}"
            else:
                tag = reply_to.from_user.mention_html(reply_to.from_user.first_name)

        if len(link) == 0 or not is_url(link) and not is_magnet(link):
            if file_ is None:
                reply_text = reply_to.text.split(maxsplit=1)[0].strip()
                if is_url(reply_text) or is_magnet(reply_text):
                    link = reply_to.text.strip()
            elif isinstance(file_, list):
                link = file_[-1].get_file().file_path
            elif not isQbit and file_.mime_type != "application/x-bittorrent":
                listener = MirrorLeechListener(bot, message, isZip, extract, isQbit, isLeech, pswd, tag)
                Thread(target=TelegramDownloadHelper(listener).add_download, args=(message, f'{DOWNLOAD_DIR}{listener.uid}/', name)).start()
                if multi > 1:
                    sleep(4)
                    nextmsg = type('nextmsg', (object, ), {'chat_id': message.chat_id, 'message_id': message.reply_to_message.message_id + 1})
                    nextmsg = sendMessage(message.text, bot, nextmsg)
                    nextmsg.from_user.id = message.from_user.id
                    multi -= 1
                    sleep(4)
                    Thread(target=_mirror_leech, args=(bot, nextmsg, isZip, extract, isQbit, isLeech, multi)).start()
                return
            else:
                link = file_.get_file().file_path

    if not is_url(link) and not is_magnet(link) and not ospath.exists(link):
        help_msg = "<b>Send link along with command line:</b>"
        if isQbit:
            help_msg += "\n\n<b>Bittorrent selection:</b>"
            help_msg += "\n<code>/cmd</code> <b>s</b> {link} or by replying to {file/link}"
            help_msg += "\n\n<b>Qbittorrent seed</b>:"
            help_msg += "\n<code>/qbcmd</code> <b>d</b> {link} or by replying to {file/link}.\n"
            help_msg += "To specify a ratio and seed time, use d:ratio:time. Ex: d:0.7:10 (ratio and time), d:0.7 "
            help_msg += "(only ratio) or d::10 (only time), where time is in minutes."
            help_msg += "\n\n<b>Multi links only by replying to first link/file:</b>"
            help_msg += "\n<code>/command</code> 10(number of links/files)"
        else:
            help_msg += "\n<code>/cmd</code> {link} |newname pswd: xx [zip/unzip]"
            help_msg += "\n\n<b>By replying to link/file:</b>"
            help_msg += "\n<code>/cmd</code> |newname pswd: xx [zip/unzip]"
            help_msg += "\n\n<b>Direct link authorization:</b>"
            help_msg += "\n<code>/cmd</code> {link} |newname pswd: xx\nusername\npassword"
            help_msg += "\n\n<b>Bittorrent selection:</b>"
            help_msg += "\n<code>/cmd</code> <b>s</b> {link} or by replying to {file/link}"
            help_msg += "\n\n<b>Bittorrent seed</b>:"
            help_msg += "\n<code>/qbcmd</code> <b>d</b> {link} or by replying to {file/link}.\n"
            help_msg += "To specify a ratio and seed time, use d:ratio:time. Ex: d:0.7:10 (ratio and time), d:0.7 "
            help_msg += "(only ratio) or d::10 (only time), where time is in minutes."
            help_msg += "\n\n<b>Multi links only by replying to first link/file:</b>"
            help_msg += "\n<code>/command</code> 10(number of links/files)"
        return sendMessage(help_msg, bot, message)

    LOGGER.info(link)

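    # Plain http(s) links that are not torrents or cloud links go through the
    # direct-link generator so aria2 gets a directly downloadable URL.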
    if not is_mega_link(link) and not isQbit and not is_magnet(link) \
            and not is_gdrive_link(link) and not link.endswith('.torrent'):
        content_type = get_content_type(link)
        if content_type is None or re_match(r'text/html|text/plain', content_type):
            try:
                link = direct_link_generator(link)
                LOGGER.info(f"Generated link: {link}")
            except DirectDownloadLinkException as e:
                LOGGER.info(str(e))
                if str(e).startswith('ERROR:'):
                    return sendMessage(str(e), bot, message)
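    # qb commands with an http(s) link: fetch the .torrent file locally first,
    # then hand the file path to qBittorrent.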
    elif isQbit and not is_magnet(link):
        if link.endswith('.torrent') or "https://api.telegram.org/file/" in link:
            content_type = None
        else:
            content_type = get_content_type(link)
        if content_type is None or re_match(r'application/x-bittorrent|application/octet-stream', content_type):
            try:
                resp = rget(link, timeout=10, headers={'user-agent': 'Wget/1.12'})
                if resp.status_code == 200:
                    file_name = str(time()).replace(".", "") + ".torrent"
                    with open(file_name, "wb") as t:
                        t.write(resp.content)
                    link = str(file_name)
                else:
                    return sendMessage(f"{tag} ERROR: link got HTTP response: {resp.status_code}", bot, message)
            except Exception as e:
                error = str(e).replace('<', ' ').replace('>', ' ')
                if error.startswith('No connection adapters were found for'):
                    link = error.split("'")[1]
                else:
                    LOGGER.error(str(e))
                    return sendMessage(tag + " " + error, bot, message)
        else:
            msg = "Qb commands are for torrents only. If you are trying to download a torrent then report it."
            return sendMessage(msg, bot, message)

    listener = MirrorLeechListener(bot, message, isZip, extract, isQbit, isLeech, pswd, tag, select, seed)

    if is_gdrive_link(link):
        if not isZip and not extract and not isLeech:
            gmsg = f"Use /{BotCommands.CloneCommand} to clone Google Drive file/folder\n\n"
            gmsg += f"Use /{BotCommands.ZipMirrorCommand[0]} to make a zip of a Google Drive folder\n\n"
            gmsg += f"Use /{BotCommands.UnzipMirrorCommand[0]} to extract a Google Drive archive folder/file"
            sendMessage(gmsg, bot, message)
        else:
            Thread(target=add_gd_download, args=(link, f'{DOWNLOAD_DIR}{listener.uid}', listener, name)).start()
    elif is_mega_link(link):
        Thread(target=add_mega_download, args=(link, f'{DOWNLOAD_DIR}{listener.uid}/', listener, name)).start()
    elif isQbit and (is_magnet(link) or ospath.exists(link)):
        Thread(target=QbDownloader(listener).add_qb_torrent, args=(link, f'{DOWNLOAD_DIR}{listener.uid}',
                                                                   select, ratio, seed_time)).start()
    else:
        if len(mesg) > 1:
            ussr = mesg[1]
            if len(mesg) > 2:
                pssw = mesg[2]
            else:
                pssw = ''
            auth = f"{ussr}:{pssw}"
            auth = "Basic " + b64encode(auth.encode()).decode('ascii')
        else:
            auth = ''
        Thread(target=add_aria2c_download, args=(link, f'{DOWNLOAD_DIR}{listener.uid}', listener, name,
                                                 auth, select, ratio, seed_time)).start()

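    # multi: fake a Message pointing at the next replied message and re-dispatch,
    # so each replied link/file runs as its own task.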
    if multi > 1:
        sleep(4)
        nextmsg = type('nextmsg', (object, ), {'chat_id': message.chat_id, 'message_id': message.reply_to_message.message_id + 1})
        nextmsg = sendMessage(message.text, bot, nextmsg)
        nextmsg.from_user.id = message.from_user.id
        multi -= 1
        sleep(4)
        Thread(target=_mirror_leech, args=(bot, nextmsg, isZip, extract, isQbit, isLeech, multi)).start()


def mirror(update, context):
    _mirror_leech(context.bot, update.message)

def unzip_mirror(update, context):
    _mirror_leech(context.bot, update.message, extract=True)

def zip_mirror(update, context):
    _mirror_leech(context.bot, update.message, True)

def qb_mirror(update, context):
    _mirror_leech(context.bot, update.message, isQbit=True)

def qb_unzip_mirror(update, context):
    _mirror_leech(context.bot, update.message, extract=True, isQbit=True)

def qb_zip_mirror(update, context):
    _mirror_leech(context.bot, update.message, True, isQbit=True)

def leech(update, context):
    _mirror_leech(context.bot, update.message, isLeech=True)

def unzip_leech(update, context):
    _mirror_leech(context.bot, update.message, extract=True, isLeech=True)

def zip_leech(update, context):
    _mirror_leech(context.bot, update.message, True, isLeech=True)

def qb_leech(update, context):
    _mirror_leech(context.bot, update.message, isQbit=True, isLeech=True)

def qb_unzip_leech(update, context):
    _mirror_leech(context.bot, update.message, extract=True, isQbit=True, isLeech=True)

def qb_zip_leech(update, context):
    _mirror_leech(context.bot, update.message, True, isQbit=True, isLeech=True)

mirror_handler = CommandHandler(BotCommands.MirrorCommand, mirror,
                                filters=CustomFilters.authorized_chat | CustomFilters.authorized_user, run_async=True)
unzip_mirror_handler = CommandHandler(BotCommands.UnzipMirrorCommand, unzip_mirror,
                                      filters=CustomFilters.authorized_chat | CustomFilters.authorized_user, run_async=True)
zip_mirror_handler = CommandHandler(BotCommands.ZipMirrorCommand, zip_mirror,
                                    filters=CustomFilters.authorized_chat | CustomFilters.authorized_user, run_async=True)
qb_mirror_handler = CommandHandler(BotCommands.QbMirrorCommand, qb_mirror,
                                   filters=CustomFilters.authorized_chat | CustomFilters.authorized_user, run_async=True)
qb_unzip_mirror_handler = CommandHandler(BotCommands.QbUnzipMirrorCommand, qb_unzip_mirror,
                                         filters=CustomFilters.authorized_chat | CustomFilters.authorized_user, run_async=True)
qb_zip_mirror_handler = CommandHandler(BotCommands.QbZipMirrorCommand, qb_zip_mirror,
                                       filters=CustomFilters.authorized_chat | CustomFilters.authorized_user, run_async=True)
leech_handler = CommandHandler(BotCommands.LeechCommand, leech,
                               filters=CustomFilters.authorized_chat | CustomFilters.authorized_user, run_async=True)
unzip_leech_handler = CommandHandler(BotCommands.UnzipLeechCommand, unzip_leech,
                                     filters=CustomFilters.authorized_chat | CustomFilters.authorized_user, run_async=True)
zip_leech_handler = CommandHandler(BotCommands.ZipLeechCommand, zip_leech,
                                   filters=CustomFilters.authorized_chat | CustomFilters.authorized_user, run_async=True)
qb_leech_handler = CommandHandler(BotCommands.QbLeechCommand, qb_leech,
                                  filters=CustomFilters.authorized_chat | CustomFilters.authorized_user, run_async=True)
qb_unzip_leech_handler = CommandHandler(BotCommands.QbUnzipLeechCommand, qb_unzip_leech,
                                        filters=CustomFilters.authorized_chat | CustomFilters.authorized_user, run_async=True)
qb_zip_leech_handler = CommandHandler(BotCommands.QbZipLeechCommand, qb_zip_leech,
                                      filters=CustomFilters.authorized_chat | CustomFilters.authorized_user, run_async=True)

dispatcher.add_handler(mirror_handler)
dispatcher.add_handler(unzip_mirror_handler)
dispatcher.add_handler(zip_mirror_handler)
dispatcher.add_handler(qb_mirror_handler)
dispatcher.add_handler(qb_unzip_mirror_handler)
dispatcher.add_handler(qb_zip_mirror_handler)
dispatcher.add_handler(leech_handler)
dispatcher.add_handler(unzip_leech_handler)
dispatcher.add_handler(zip_leech_handler)
dispatcher.add_handler(qb_leech_handler)
dispatcher.add_handler(qb_unzip_leech_handler)
dispatcher.add_handler(qb_zip_leech_handler)

@@ -1,18 +1,19 @@
 from requests import get as rget
-from time import sleep
+from time import time
 from threading import Thread
 from html import escape
 from urllib.parse import quote
 from telegram import InlineKeyboardMarkup
 from telegram.ext import CommandHandler, CallbackQueryHandler
+from os import remove

 from bot import dispatcher, LOGGER, SEARCH_API_LINK, SEARCH_PLUGINS, get_client, SEARCH_LIMIT
-from bot.helper.ext_utils.telegraph_helper import telegraph
-from bot.helper.telegram_helper.message_utils import editMessage, sendMessage, sendMarkup
+from bot.helper.telegram_helper.message_utils import editMessage, sendMessage, sendMarkup, deleteMessage, sendFile
 from bot.helper.telegram_helper.filters import CustomFilters
 from bot.helper.telegram_helper.bot_commands import BotCommands
 from bot.helper.ext_utils.bot_utils import get_readable_file_size
 from bot.helper.telegram_helper import button_build
+from bot.helper.ext_utils.html_helper import html_template

 if SEARCH_PLUGINS is not None:
     PLUGINS = []
@@ -44,8 +45,6 @@ SITES = {
     "all": "All"
 }

-TELEGRAPH_LIMIT = 300
-

 def torser(update, context):
     user_id = update.message.from_user.id
@@ -109,12 +108,12 @@ def torserbut(update, context):
             editMessage(f"<b>Searching for <i>{key}</i>\nTorrent Site:- <i>{SITES.get(site)}</i></b>", message)
         else:
             editMessage(f"<b>Searching for <i>{key}</i>\nTorrent Site:- <i>{site.capitalize()}</i></b>", message)
-        Thread(target=_search, args=(key, site, message, method)).start()
+        Thread(target=_search, args=(context.bot, key, site, message, method)).start()
     else:
         query.answer()
         editMessage("Search has been canceled!", message)

-def _search(key, site, message, method):
+def _search(bot, key, site, message, method):
     if method.startswith('api'):
         if method == 'apisearch':
             LOGGER.info(f"API Searching: {key} from {site}")
@@ -139,20 +138,20 @@ def _search(key, site, message, method):
             search_results = resp.json()
             if "error" in search_results.keys():
                 return editMessage(f"No result found for <i>{key}</i>\nTorrent Site:- <i>{SITES.get(site)}</i>", message)
-            msg = f"<b>Found {min(search_results['total'], TELEGRAPH_LIMIT)}</b>"
+            cap = f"<b>Found {search_results['total']}</b>"
             if method == 'apitrend':
-                msg += f" <b>trending result(s)\nTorrent Site:- <i>{SITES.get(site)}</i></b>"
+                cap += f" <b>trending results\nTorrent Site:- <i>{SITES.get(site)}</i></b>"
             elif method == 'apirecent':
-                msg += f" <b>recent result(s)\nTorrent Site:- <i>{SITES.get(site)}</i></b>"
+                cap += f" <b>recent results\nTorrent Site:- <i>{SITES.get(site)}</i></b>"
             else:
-                msg += f" <b>result(s) for <i>{key}</i>\nTorrent Site:- <i>{SITES.get(site)}</i></b>"
+                cap += f" <b>results for <i>{key}</i>\nTorrent Site:- <i>{SITES.get(site)}</i></b>"
             search_results = search_results['data']
         except Exception as e:
             return editMessage(str(e), message)
     else:
         LOGGER.info(f"PLUGINS Searching: {key} from {site}")
         client = get_client()
-        search = client.search_start(pattern=str(key), plugins=str(site), category='all')
+        search = client.search_start(pattern=key, plugins=site, category='all')
         search_id = search.id
         while True:
             result_status = client.search_status(search_id=search_id)
@@ -164,78 +163,69 @@ def _search(key, site, message, method):
         total_results = dict_search_results.total
         if total_results == 0:
             return editMessage(f"No result found for <i>{key}</i>\nTorrent Site:- <i>{site.capitalize()}</i>", message)
-        msg = f"<b>Found {min(total_results, TELEGRAPH_LIMIT)}</b>"
-        msg += f" <b>result(s) for <i>{key}</i>\nTorrent Site:- <i>{site.capitalize()}</i></b>"
-    link = _getResult(search_results, key, message, method)
-    buttons = button_build.ButtonMaker()
-    buttons.buildbutton("🔎 VIEW", link)
-    button = InlineKeyboardMarkup(buttons.build_menu(1))
-    editMessage(msg, message, button)
+        cap = f"<b>Found {total_results}</b>"
+        cap += f" <b>results for <i>{key}</i>\nTorrent Site:- <i>{site.capitalize()}</i></b>"
+    hmsg = _getResult(search_results, key, message, method)
+    name = f"{method}_{key}_{site}_{time()}.html"
+    with open(name, "w", encoding='utf-8') as f:
+        f.write(html_template.replace('{msg}', hmsg).replace('{title}', f'{method}_{key}_{site}'))
+    deleteMessage(bot, message)
+    sendFile(bot, message.reply_to_message, name, cap)
+    remove(name)
     if not method.startswith('api'):
         client.search_delete(search_id=search_id)

 def _getResult(search_results, key, message, method):
-    telegraph_content = []
     if method == 'apirecent':
-        msg = "<h4>API Recent Results</h4>"
+        msg = '<span class="container center rfontsize"><h4>API Recent Results</h4></span>'
     elif method == 'apisearch':
-        msg = f"<h4>API Search Result(s) For {key}</h4>"
+        msg = f'<span class="container center rfontsize"><h4>API Search Results For {key}</h4></span>'
     elif method == 'apitrend':
-        msg = "<h4>API Trending Results</h4>"
+        msg = '<span class="container center rfontsize"><h4>API Trending Results</h4></span>'
     else:
-        msg = f"<h4>PLUGINS Search Result(s) For {key}</h4>"
-    for index, result in enumerate(search_results, start=1):
+        msg = f'<span class="container center rfontsize"><h4>PLUGINS Search Results For {key}</h4></span>'
+    for result in search_results:
+        msg += '<span class="container start rfontsize">'
         if method.startswith('api'):
             if 'name' in result.keys():
-                msg += f"<code><a href='{result['url']}'>{escape(result['name'])}</a></code><br>"
+                msg += f"<div> <a class='withhover' href='{result['url']}'>{escape(result['name'])}</a></div>"
             if 'torrents' in result.keys():
                 for subres in result['torrents']:
-                    msg += f"<b>Quality: </b>{subres['quality']} | <b>Type: </b>{subres['type']} | <b>Size: </b>{subres['size']}<br>"
+                    msg += f"<span class='topmarginsm'><b>Quality: </b>{subres['quality']} | "
+                    msg += f"<b>Type: </b>{subres['type']} | <b>Size: </b>{subres['size']}</span>"
                     if 'torrent' in subres.keys():
-                        msg += f"<a href='{subres['torrent']}'>Direct Link</a><br>"
+                        msg += "<span class='topmarginxl'><a class='withhover' "
+                        msg += f"href='{subres['torrent']}'>Direct Link</a></span>"
                     elif 'magnet' in subres.keys():
-                        msg += f"<b>Share Magnet to</b> <a href='http://t.me/share/url?url={subres['magnet']}'>Telegram</a><br>"
+                        msg += "<span><b>Share Magnet to</b> <a class='withhover' "
+                        msg += f"href='http://t.me/share/url?url={subres['magnet']}'>Telegram</a></span>"
-                msg += '<br>'
             else:
-                msg += f"<b>Size: </b>{result['size']}<br>"
+                msg += f"<span class='topmarginsm'><b>Size: </b>{result['size']}</span>"
                 try:
-                    msg += f"<b>Seeders: </b>{result['seeders']} | <b>Leechers: </b>{result['leechers']}<br>"
+                    msg += f"<span class='topmarginsm'><b>Seeders: </b>{result['seeders']} | "
+                    msg += f"<b>Leechers: </b>{result['leechers']}</span>"
                 except:
                     pass
                 if 'torrent' in result.keys():
-                    msg += f"<a href='{result['torrent']}'>Direct Link</a><br><br>"
+                    msg += f"<span class='topmarginxl'><a class='withhover' "
+                    msg += f"href='{result['torrent']}'>Direct Link</a></span>"
                 elif 'magnet' in result.keys():
-                    msg += f"<b>Share Magnet to</b> <a href='http://t.me/share/url?url={quote(result['magnet'])}'>Telegram</a><br><br>"
+                    msg += f"<span class='topmarginxl'><b>Share Magnet to</b> <a class='withhover' "
+                    msg += f"href='http://t.me/share/url?url={quote(result['magnet'])}'>Telegram</a></span>"
         else:
-            msg += f"<a href='{result.descrLink}'>{escape(result.fileName)}</a><br>"
-            msg += f"<b>Size: </b>{get_readable_file_size(result.fileSize)}<br>"
-            msg += f"<b>Seeders: </b>{result.nbSeeders} | <b>Leechers: </b>{result.nbLeechers}<br>"
+            msg += f"<div> <a class='withhover' href='{result.descrLink}'>{escape(result.fileName)}</a></div>"
+            msg += f"<span class='topmarginsm'><b>Size: </b>{get_readable_file_size(result.fileSize)}</span>"
+            msg += f"<span class='topmarginsm'><b>Seeders: </b>{result.nbSeeders} | "
+            msg += f"<b>Leechers: </b>{result.nbLeechers}</span>"
             link = result.fileUrl
             if link.startswith('magnet:'):
-                msg += f"<b>Share Magnet to</b> <a href='http://t.me/share/url?url={quote(link)}'>Telegram</a><br><br>"
+                msg += f"<span class='topmarginxl'><b>Share Magnet to</b> <a class='withhover' "
+                msg += f"href='http://t.me/share/url?url={quote(link)}'>Telegram</a></span>"
             else:
-                msg += f"<a href='{link}'>Direct Link</a><br><br>"
-
-        if len(msg.encode('utf-8')) > 39000:
-            telegraph_content.append(msg)
-            msg = ""
-
-        if index == TELEGRAPH_LIMIT:
-            break
-
-    if msg != "":
-        telegraph_content.append(msg)
-
-    editMessage(f"<b>Creating</b> {len(telegraph_content)} <b>Telegraph pages.</b>", message)
-    path = [telegraph.create_page(
-        title='Mirror-leech-bot Torrent Search',
-        content=content
-    )["path"] for content in telegraph_content]
-    sleep(0.5)
-    if len(path) > 1:
-        editMessage(f"<b>Editing</b> {len(telegraph_content)} <b>Telegraph pages.</b>", message)
-        telegraph.edit_telegraph(path, telegraph_content)
-    return f"https://telegra.ph/{path[0]}"
+                msg += f"<span class='topmarginxl'><a class='withhover' href='{link}'>Direct Link</a></span>"
+        msg += '</span>'
+    return msg

 def _api_buttons(user_id, method):
     buttons = button_build.ButtonMaker()

@@ -11,11 +11,11 @@ from bot.helper.ext_utils.bot_utils import get_readable_file_size, is_url
 from bot.helper.mirror_utils.download_utils.youtube_dl_download_helper import YoutubeDLHelper
 from bot.helper.telegram_helper.bot_commands import BotCommands
 from bot.helper.telegram_helper.filters import CustomFilters
-from .mirror import MirrorListener
+from .listener import MirrorLeechListener

 listener_dict = {}

-def _watch(bot, message, isZip=False, isLeech=False, multi=0):
+def _ytdl(bot, message, isZip=False, isLeech=False, multi=0):
     mssg = message.text
     user_id = message.from_user.id
     msg_id = message.message_id
@@ -24,7 +24,8 @@ def _watch(bot, message, isZip=False, isLeech=False, multi=0):
     if len(link) > 1:
         link = link[1].strip()
         if link.strip().isdigit():
-            multi = int(link)
+            if multi == 0:
+                multi = int(link)
             link = ''
         elif link.strip().startswith(("|", "pswd:", "args:")):
             link = ''
@@ -82,7 +83,7 @@ def _watch(bot, message, isZip=False, isLeech=False, multi=0):
         help_msg += "\n\nCheck all arguments from this <a href='https://github.com/yt-dlp/yt-dlp/blob/master/yt_dlp/YoutubeDL.py#L174'>FILE</a>."
         return sendMessage(help_msg, bot, message)

-    listener = MirrorListener(bot, message, isZip, isLeech=isLeech, pswd=pswd, tag=tag)
+    listener = MirrorLeechListener(bot, message, isZip, isLeech=isLeech, pswd=pswd, tag=tag)
     buttons = button_build.ButtonMaker()
     best_video = "bv*+ba/b"
     best_audio = "ba/b"
@@ -159,11 +160,11 @@ def _watch(bot, message, isZip=False, isLeech=False, multi=0):
     if multi > 1:
         sleep(4)
         nextmsg = type('nextmsg', (object, ), {'chat_id': message.chat_id, 'message_id': message.reply_to_message.message_id + 1})
-        nextmsg = sendMessage(mssg.split(' ')[0], bot, nextmsg)
+        nextmsg = sendMessage(mssg, bot, nextmsg)
         nextmsg.from_user.id = message.from_user.id
         multi -= 1
         sleep(4)
-        Thread(target=_watch, args=(bot, nextmsg, isZip, isLeech, multi)).start()
+        Thread(target=_ytdl, args=(bot, nextmsg, isZip, isLeech, multi)).start()

 def _qual_subbuttons(task_id, qual, msg):
     buttons = button_build.ButtonMaker()
@@ -271,30 +272,30 @@ def _auto_cancel(msg, msg_id):
     except:
         pass

-def watch(update, context):
-    _watch(context.bot, update.message)
+def ytdl(update, context):
+    _ytdl(context.bot, update.message)

-def watchZip(update, context):
-    _watch(context.bot, update.message, True)
+def ytdlZip(update, context):
+    _ytdl(context.bot, update.message, True)

-def leechWatch(update, context):
-    _watch(context.bot, update.message, isLeech=True)
+def ytdlleech(update, context):
+    _ytdl(context.bot, update.message, isLeech=True)

-def leechWatchZip(update, context):
-    _watch(context.bot, update.message, True, True)
+def ytdlZipleech(update, context):
+    _ytdl(context.bot, update.message, True, True)

-watch_handler = CommandHandler(BotCommands.WatchCommand, watch,
+ytdl_handler = CommandHandler(BotCommands.YtdlCommand, ytdl,
                                filters=CustomFilters.authorized_chat | CustomFilters.authorized_user, run_async=True)
-zip_watch_handler = CommandHandler(BotCommands.ZipWatchCommand, watchZip,
+ytdl_zip_handler = CommandHandler(BotCommands.YtdlZipCommand, ytdlZip,
                                    filters=CustomFilters.authorized_chat | CustomFilters.authorized_user, run_async=True)
-leech_watch_handler = CommandHandler(BotCommands.LeechWatchCommand, leechWatch,
+ytdl_leech_handler = CommandHandler(BotCommands.YtdlLeechCommand, ytdlleech,
                                      filters=CustomFilters.authorized_chat | CustomFilters.authorized_user, run_async=True)
-leech_zip_watch_handler = CommandHandler(BotCommands.LeechZipWatchCommand, leechWatchZip,
+ytdl_zip_leech_handler = CommandHandler(BotCommands.YtdlZipLeechCommand, ytdlZipleech,
                                          filters=CustomFilters.authorized_chat | CustomFilters.authorized_user, run_async=True)
 quality_handler = CallbackQueryHandler(select_format, pattern="qu", run_async=True)

-dispatcher.add_handler(watch_handler)
-dispatcher.add_handler(zip_watch_handler)
-dispatcher.add_handler(leech_watch_handler)
-dispatcher.add_handler(leech_zip_watch_handler)
+dispatcher.add_handler(ytdl_handler)
+dispatcher.add_handler(ytdl_zip_handler)
+dispatcher.add_handler(ytdl_leech_handler)
+dispatcher.add_handler(ytdl_zip_leech_handler)
 dispatcher.add_handler(quality_handler)

extract (deleted file, 199 lines)
@@ -1,199 +0,0 @@
#!/bin/bash

if [ $# -lt 1 ]; then
    echo "Usage: $(basename $0) FILES"
    exit 1
fi

extract() {
    arg="$1"
    cd "$(dirname "$arg")" || exit
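    # Dispatch on the file extension; 7z handles most formats and extracts
    # into a directory named after the archive.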
    case "$arg" in
        *.tar.bz2)
            tar xjf "$arg" --one-top-level
            local code=$?
            ;;
        *.tar.gz)
            tar xzf "$arg" --one-top-level
            local code=$?
            ;;
        *.bz2)
            bunzip2 "$arg"
            local code=$?
            ;;
        *.gz)
            gunzip "$arg"
            local code=$?
            ;;
        *.tar)
            tar xf "$arg" --one-top-level
            local code=$?
            ;;
        *.tbz2)
            (tar xjf "$arg" --one-top-level)
            local code=$?
            ;;
        *.tgz)
            tar xzf "$arg" --one-top-level
            local code=$?
            ;;
        *.tar.xz)
            a_dir=$(expr "$arg" : '\(.*\).tar.xz')
            7z x "$arg" -o"$a_dir"
            local code=$?
            ;;
        *.zip)
            a_dir=$(expr "$arg" : '\(.*\).zip')
            7z x "$arg" -o"$a_dir"
            local code=$?
            ;;
        *.7z)
            a_dir=$(expr "$arg" : '\(.*\).7z')
            7z x "$arg" -o"$a_dir"
            local code=$?
            ;;
        *.Z)
            uncompress "$arg"
            local code=$?
            ;;
        *.rar)
            a_dir=$(expr "$arg" : '\(.*\).rar')
            mkdir "$a_dir"
            7z x "$arg" -o"$a_dir"
            local code=$?
            ;;
        *.iso)
            a_dir=$(expr "$arg" : '\(.*\).iso')
            7z x "$arg" -o"$a_dir"
            local code=$?
            ;;
        *.wim)
            a_dir=$(expr "$arg" : '\(.*\).wim')
            7z x "$arg" -o"$a_dir"
            local code=$?
            ;;
        *.cab)
            a_dir=$(expr "$arg" : '\(.*\).cab')
            7z x "$arg" -o"$a_dir"
            local code=$?
            ;;
        *.apm)
            a_dir=$(expr "$arg" : '\(.*\).apm')
            7z x "$arg" -o"$a_dir"
            local code=$?
            ;;
        *.arj)
            a_dir=$(expr "$arg" : '\(.*\).arj')
            7z x "$arg" -o"$a_dir"
            local code=$?
            ;;
        *.chm)
            a_dir=$(expr "$arg" : '\(.*\).chm')
            7z x "$arg" -o"$a_dir"
            local code=$?
            ;;
        *.cpio)
            a_dir=$(expr "$arg" : '\(.*\).cpio')
            7z x "$arg" -o"$a_dir"
            local code=$?
            ;;
        *.cramfs)
            a_dir=$(expr "$arg" : '\(.*\).cramfs')
            7z x "$arg" -o"$a_dir"
            local code=$?
            ;;
        *.deb)
            a_dir=$(expr "$arg" : '\(.*\).deb')
            7z x "$arg" -o"$a_dir"
            local code=$?
            ;;
        *.dmg)
            a_dir=$(expr "$arg" : '\(.*\).dmg')
            7z x "$arg" -o"$a_dir"
            local code=$?
            ;;
        *.fat)
            a_dir=$(expr "$arg" : '\(.*\).fat')
            7z x "$arg" -o"$a_dir"
            local code=$?
            ;;
        *.hfs)
            a_dir=$(expr "$arg" : '\(.*\).hfs')
            7z x "$arg" -o"$a_dir"
            local code=$?
            ;;
        *.lzh)
            a_dir=$(expr "$arg" : '\(.*\).lzh')
            7z x "$arg" -o"$a_dir"
            local code=$?
            ;;
        *.lzma)
            a_dir=$(expr "$arg" : '\(.*\).lzma')
            7z x "$arg" -o"$a_dir"
            local code=$?
            ;;
        *.lzma2)
            a_dir=$(expr "$arg" : '\(.*\).lzma2')
            7z x "$arg" -o"$a_dir"
            local code=$?
            ;;
        *.mbr)
            a_dir=$(expr "$arg" : '\(.*\).mbr')
            7z x "$arg" -o"$a_dir"
            local code=$?
            ;;
        *.msi)
            a_dir=$(expr "$arg" : '\(.*\).msi')
            7z x "$arg" -o"$a_dir"
            local code=$?
            ;;
        *.mslz)
            a_dir=$(expr "$arg" : '\(.*\).mslz')
            7z x "$arg" -o"$a_dir"
            local code=$?
            ;;
        *.nsis)
            a_dir=$(expr "$arg" : '\(.*\).nsis')
            7z x "$arg" -o"$a_dir"
            local code=$?
            ;;
        *.ntfs)
            a_dir=$(expr "$arg" : '\(.*\).ntfs')
            7z x "$arg" -o"$a_dir"
            local code=$?
            ;;
        *.rpm)
            a_dir=$(expr "$arg" : '\(.*\).rpm')
            7z x "$arg" -o"$a_dir"
            local code=$?
            ;;
        *.squashfs)
            a_dir=$(expr "$arg" : '\(.*\).squashfs')
            7z x "$arg" -o"$a_dir"
            local code=$?
            ;;
        *.udf)
            a_dir=$(expr "$arg" : '\(.*\).udf')
            7z x "$arg" -o"$a_dir"
            local code=$?
            ;;
        *.vhd)
            a_dir=$(expr "$arg" : '\(.*\).vhd')
            7z x "$arg" -o"$a_dir"
            local code=$?
            ;;
        *.xar)
            a_dir=$(expr "$arg" : '\(.*\).xar')
            7z x "$arg" -o"$a_dir"
            local code=$?
            ;;
        *)
            echo "'$arg' cannot be extracted via extract()" 1>&2
            exit 1
            ;;
    esac
    cd - || exit $?
    exit $code
}

extract "$1"

pextract (deleted file, 200 lines)
@@ -1,200 +0,0 @@
#!/bin/bash

if [ $# -lt 1 ]; then
    echo "Usage: $(basename $0) FILES"
    exit 1
fi

extract() {
    arg="$1"
    pswd="$2"
    cd "$(dirname "$arg")" || exit
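    # Same dispatch as the extract script, but forwards the password to 7z via -p"$pswd".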
    case "$arg" in
        *.tar.bz2)
            tar xjf "$arg" --one-top-level
            local code=$?
            ;;
        *.tar.gz)
            tar xzf "$arg" --one-top-level
            local code=$?
            ;;
        *.bz2)
            bunzip2 "$arg"
            local code=$?
            ;;
        *.gz)
            gunzip "$arg"
            local code=$?
            ;;
        *.tar)
            tar xf "$arg" --one-top-level
            local code=$?
            ;;
        *.tbz2)
            (tar xjf "$arg" --one-top-level)
            local code=$?
            ;;
        *.tgz)
            tar xzf "$arg" --one-top-level
            local code=$?
            ;;
        *.tar.xz)
            a_dir=$(expr "$arg" : '\(.*\).tar.xz')
            7z x "$arg" -o"$a_dir" -p"$pswd"
            local code=$?
            ;;
        *.zip)
            a_dir=$(expr "$arg" : '\(.*\).zip')
            7z x "$arg" -o"$a_dir" -p"$pswd"
            local code=$?
            ;;
        *.7z)
            a_dir=$(expr "$arg" : '\(.*\).7z')
            7z x "$arg" -o"$a_dir" -p"$pswd"
            local code=$?
            ;;
        *.Z)
            uncompress "$arg"
            local code=$?
            ;;
        *.rar)
            a_dir=$(expr "$arg" : '\(.*\).rar')
            mkdir "$a_dir"
            7z x "$arg" -o"$a_dir" -p"$pswd"
            local code=$?
            ;;
        *.iso)
            a_dir=$(expr "$arg" : '\(.*\).iso')
            7z x "$arg" -o"$a_dir" -p"$pswd"
            local code=$?
            ;;
        *.wim)
            a_dir=$(expr "$arg" : '\(.*\).wim')
            7z x "$arg" -o"$a_dir" -p"$pswd"
            local code=$?
            ;;
        *.cab)
            a_dir=$(expr "$arg" : '\(.*\).cab')
            7z x "$arg" -o"$a_dir" -p"$pswd"
            local code=$?
            ;;
        *.apm)
            a_dir=$(expr "$arg" : '\(.*\).apm')
            7z x "$arg" -o"$a_dir" -p"$pswd"
            local code=$?
            ;;
        *.arj)
            a_dir=$(expr "$arg" : '\(.*\).arj')
            7z x "$arg" -o"$a_dir" -p"$pswd"
            local code=$?
            ;;
        *.chm)
            a_dir=$(expr "$arg" : '\(.*\).chm')
            7z x "$arg" -o"$a_dir" -p"$pswd"
            local code=$?
            ;;
        *.cpio)
            a_dir=$(expr "$arg" : '\(.*\).cpio')
            7z x "$arg" -o"$a_dir" -p"$pswd"
            local code=$?
            ;;
        *.cramfs)
            a_dir=$(expr "$arg" : '\(.*\).cramfs')
            7z x "$arg" -o"$a_dir" -p"$pswd"
            local code=$?
            ;;
        *.deb)
            a_dir=$(expr "$arg" : '\(.*\).deb')
            7z x "$arg" -o"$a_dir" -p"$pswd"
            local code=$?
            ;;
        *.dmg)
            a_dir=$(expr "$arg" : '\(.*\).dmg')
            7z x "$arg" -o"$a_dir" -p"$pswd"
            local code=$?
            ;;
        *.fat)
            a_dir=$(expr "$arg" : '\(.*\).fat')
            7z x "$arg" -o"$a_dir" -p"$pswd"
            local code=$?
            ;;
        *.hfs)
            a_dir=$(expr "$arg" : '\(.*\).hfs')
            7z x "$arg" -o"$a_dir" -p"$pswd"
            local code=$?
            ;;
        *.lzh)
            a_dir=$(expr "$arg" : '\(.*\).lzh')
            7z x "$arg" -o"$a_dir" -p"$pswd"
            local code=$?
            ;;
        *.lzma)
            a_dir=$(expr "$arg" : '\(.*\).lzma')
            7z x "$arg" -o"$a_dir" -p"$pswd"
            local code=$?
            ;;
        *.lzma2)
            a_dir=$(expr "$arg" : '\(.*\).lzma2')
            7z x "$arg" -o"$a_dir" -p"$pswd"
            local code=$?
            ;;
        *.mbr)
            a_dir=$(expr "$arg" : '\(.*\).mbr')
            7z x "$arg" -o"$a_dir" -p"$pswd"
            local code=$?
            ;;
        *.msi)
            a_dir=$(expr "$arg" : '\(.*\).msi')
            7z x "$arg" -o"$a_dir" -p"$pswd"
            local code=$?
            ;;
        *.mslz)
            a_dir=$(expr "$arg" : '\(.*\).mslz')
            7z x "$arg" -o"$a_dir" -p"$pswd"
            local code=$?
            ;;
        *.nsis)
            a_dir=$(expr "$arg" : '\(.*\).nsis')
            7z x "$arg" -o"$a_dir" -p"$pswd"
            local code=$?
            ;;
        *.ntfs)
            a_dir=$(expr "$arg" : '\(.*\).ntfs')
            7z x "$arg" -o"$a_dir" -p"$pswd"
            local code=$?
            ;;
        *.rpm)
            a_dir=$(expr "$arg" : '\(.*\).rpm')
            7z x "$arg" -o"$a_dir" -p"$pswd"
            local code=$?
            ;;
        *.squashfs)
            a_dir=$(expr "$arg" : '\(.*\).squashfs')
            7z x "$arg" -o"$a_dir" -p"$pswd"
            local code=$?
            ;;
        *.udf)
            a_dir=$(expr "$arg" : '\(.*\).udf')
            7z x "$arg" -o"$a_dir" -p"$pswd"
            local code=$?
            ;;
        *.vhd)
            a_dir=$(expr "$arg" : '\(.*\).vhd')
            7z x "$arg" -o"$a_dir" -p"$pswd"
            local code=$?
            ;;
        *.xar)
            a_dir=$(expr "$arg" : '\(.*\).xar')
            7z x "$arg" -o"$a_dir" -p"$pswd"
            local code=$?
            ;;
        *)
            echo "'$arg' cannot be extracted via extract()" 1>&2
            exit 1
            ;;
    esac
    cd - || exit $?
    exit $code
}

extract "$1" "$2"

@@ -4,7 +4,8 @@ Accepted=true
 [BitTorrent]
 Session\AsyncIOThreadsCount=16
 Session\MultiConnectionsPerIp=true
-Session\SlowTorrentsDownloadRate=50
+Session\SlowTorrentsDownloadRate=2
+Session\SlowTorrentsUploadRate=2
 Session\SlowTorrentsInactivityTimer=600
 Session\GlobalMaxSeedingMinutes=-1

@@ -8,6 +8,8 @@ from web.nodes import make_tree

 app = Flask(__name__)

+aria2 = ariaAPI(ariaClient(host="http://localhost", port=6800, secret=""))
+
 basicConfig(format='%(asctime)s - %(name)s - %(levelname)s - %(message)s',
             handlers=[FileHandler('log.txt'), StreamHandler()],
             level=INFO)
@@ -711,7 +713,6 @@ def list_torrent_contents(id_):
         cont = make_tree(res)
         client.auth_log_out()
     else:
-        aria2 = ariaAPI(ariaClient(host="http://localhost", port=6800, secret=""))
         res = aria2.client.get_files(id_)
         cont = make_tree(res, True)
     return page.replace("{My_content}", cont[0]).replace("{form_url}", f"/app/files/{id_}?pin_code={pincode}")
@@ -764,7 +765,6 @@ def set_priority(id_):

     resume = resume.strip(",")

-    aria2 = ariaAPI(ariaClient(host="http://localhost", port=6800, secret=""))
     res = aria2.client.change_option(id_, {'select-file': resume})
     if res == "OK":
         LOGGER.info(f"Verified! Gid: {id_}")