mirror of
https://github.com/anasty17/mirror-leech-telegram-bot.git
synced 2025-01-05 10:36:52 +08:00
Refactor Part 2

- Fix a few errors
- Change ffmpeg cmds input. Check readme!
- Change config from .env to .py (build and deploy is required)

Signed-off-by: anasty17 <e.anastayyar@gmail.com>
This commit is contained in:
parent
0a4b4785d0
commit
f2e3be4bf9
2 .gitignore (vendored)
@@ -1,5 +1,5 @@
 mltbenv/*
-config.env
+config.py
 *.pyc
 data*
 .vscode
87 README.md
@@ -105,8 +105,8 @@ programming in Python.
 - Store RSS data
 - Store incomplete task messages
 - Store JDownloader settings
-- Store config.env file on first build; in case any change occurred to it, the next build will define variables
-  from config.env instead of the database
+- Store config.py file on first build; in case any change occurred to it, the next build will define variables
+  from config.py instead of the database

 ## Torrents Search

@@ -174,11 +174,6 @@ programming in Python.

 ## Prerequisites

-- Tutorial Video from A to Z:
-  - Thanks to [Wiszky](https://github.com/vishnoe115)
-
-  <p><a href="https://youtu.be/IUmq1paCiHI"> <img src="https://img.shields.io/badge/See%20Video-black?style=for-the-badge&logo=YouTube" width="160"/></a></p>
-
 ### 1. Installing requirements

 - Clone this repo:
@@ -212,17 +207,10 @@ pip3 install -r requirements-cli.txt
 ### 2. Setting up config file

 ```
-cp config_sample.env config.env
+cp config_sample.py config.py
 ```

-- Remove the first line saying:
-
-```
-_____REMOVE_THIS_LINE_____=True
-```
-
-Fill up the rest of the fields. The meaning of each field is discussed below. **NOTE**: All values must be filled between
-quotes, even if it's `Int`, `Bool` or `List`.
+Fill up the rest of the fields. The meaning of each field is discussed below.

 **1. Required Fields**
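The switch from `config_sample.env` to `config_sample.py` means settings are now a plain Python module instead of dotenv key=value pairs. A minimal sketch of the difference (placeholder values, not real credentials): a `.py` config keeps native types, so the old "everything must be quoted" rule goes away.

```python
# Old config.env style (parsed by python-dotenv -> every value is a string):
#   BOT_TOKEN = "123456:ABC-DEF"
#   OWNER_ID = "123456789"
#   FFMPEG_CMDS = '["-i mltb.mkv -c copy -c:s srt mltb.mkv"]'

# New config.py style: imported as Python, values keep native types.
BOT_TOKEN = "123456:ABC-DEF"  # hypothetical placeholder token
OWNER_ID = 123456789          # a real int, no quoting needed
FFMPEG_CMDS = [               # a real list, no eval() of a string
    "-i mltb.mkv -c copy -c:s srt mltb.mkv",
]
```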
@@ -243,8 +231,8 @@ quotes, even if it's `Int`, `Bool` or `List`.
   generate database. Data will be saved in the database: bot settings, user settings, RSS data and incomplete tasks. **NOTE**: You can always edit all settings saved in the database from the official site -> (Browse collections). `Str`
 - `DOWNLOAD_DIR`: The path to the VPS-local folder where downloads should be saved. `Str`
 - `CMD_SUFFIX`: Commands index number. This number will be added at the end of all commands. `Str`|`Int`
-- `AUTHORIZED_CHATS`: Fill user_id and chat_id of groups/users you want to authorize. To auth only specific topic(s), write it in this format `chat_id|thread_id`, e.g. -100XXXXXXXXXXX|10 or -100XXXXXXXXXXX|10|12. Separate them by space. `Int`
-- `SUDO_USERS`: Fill user_id of users you want to give sudo permission. Separate them by space. `Int`
+- `AUTHORIZED_CHATS`: Fill user_id and chat_id of groups/users you want to authorize. To auth only specific topic(s), write it in this format `chat_id|thread_id`, e.g. -100XXXXXXXXXXX or -100XXXXXXXXXXX|10 or -100XXXXXXXXXXX|10|12. Separate them by space. `Str`
+- `SUDO_USERS`: Fill user_id of users you want to give sudo permission. Separate them by space. `Str`
 - `DEFAULT_UPLOAD`: Whether `rc` to upload to `RCLONE_PATH` or `gd` to upload to `GDRIVE_ID`. Default is `rc`. Read
   More [HERE](https://github.com/anasty17/mirror-leech-telegram-bot/tree/master#upload). `Str`
 - `STATUS_UPDATE_INTERVAL`: Time in seconds after which the progress/status message will be updated. Recommended `10`
@@ -255,40 +243,40 @@ quotes, even if it's `Int`, `Bool` or `List`.
 - `INCOMPLETE_TASK_NOTIFIER`: Get incomplete task messages after restart. Requires database and supergroup. Default
   is `False`. `Bool`
 - `FILELION_API`: Filelion API key to mirror Filelion links. Get it
-  from [Filelion](https://vidhide.com/?op=my_account). `str`
+  from [Filelion](https://vidhide.com/?op=my_account). `Str`
 - `STREAMWISH_API`: Streamwish API key to mirror Streamwish links. Get it
-  from [Streamwish](https://streamwish.com/?op=my_account). `str`
+  from [Streamwish](https://streamwish.com/?op=my_account). `Str`
 - `YT_DLP_OPTIONS`: Default yt-dlp options. Check all possible
   options [HERE](https://github.com/yt-dlp/yt-dlp/blob/master/yt_dlp/YoutubeDL.py#L184) or use
   this [script](https://t.me/mltb_official_channel/177) to convert cli arguments to api options. Format: key:value|key:
-  value|key:value. Add `^` before integer or float; some numbers must be numeric and some string. `str`
+  value|key:value. Add `^` before integer or float; some numbers must be numeric and some string. `Str`
   - Example: "format:bv*+mergeall[vcodec=none]|nocheckcertificate:True"
 - `USE_SERVICE_ACCOUNTS`: Whether to use Service Accounts or not, with google-api-python-client. For this to work
   see [Using Service Accounts](https://github.com/anasty17/mirror-leech-telegram-bot#generate-service-accounts-what-is-service-account)
   section below. Default is `False`. `Bool`
-- `FFMPEG_CMDS`: list of lists of ffmpeg commands. You can set multiple ffmpeg commands for all files before upload. Don't write ffmpeg at the beginning; start directly with the arguments. `list`
-  - Examples: [["-i", "mltb.mkv", "-c", "copy", "-c:s", "srt", "mltb.mkv"], ["-i", "mltb.video", "-c", "copy", "-c:s", "srt", "mltb"], ["-i", "mltb.m4a", "-c:a", "libmp3lame", "-q:a", "2", "mltb.mp3"], ["-i", "mltb.audio", "-c:a", "libmp3lame", "-q:a", "2", "mltb.mp3"]]
+- `FFMPEG_CMDS`: list of ffmpeg commands. You can set multiple ffmpeg commands for all files before upload. Don't write ffmpeg at the beginning; start directly with the arguments. `List`
+  - Examples: ["-i mltb.mkv -c copy -c:s srt mltb.mkv", "-i mltb.video -c copy -c:s srt mltb", "-i mltb.m4a -c:a libmp3lame -q:a 2 mltb.mp3", "-i mltb.audio -c:a libmp3lame -q:a 2 mltb.mp3"]
   **Notes**:
-  - Add `-del` to the list(s) for which you want the bot to delete the original files after the command completes!
+  - Add `-del` to the list for which you want the bot to delete the original files after the command completes!
   - Seed will get disabled while using this option
-  - It must be a list of list(s), even if only one list is added, like [["-i", "mltb.mkv", "-c", "copy", "-c:s", "srt", "mltb.mkv", "-del"]]
   **Example**:
   - Here I will explain how to use mltb.*, which is a reference to the files you want to work on.
     1. First cmd: the input is mltb.mkv, so this cmd will work only on mkv videos, and the output is also mltb.mkv, so all outputs are mkv. `-del` will delete the original media after the cmd completes.
     2. Second cmd: the input is mltb.video, so this cmd will work on all videos, and the output is only mltb, so the extension is the same as the input files.
     3. Third cmd: the input is mltb.m4a, so this cmd will work only on m4a audios, and the output is mltb.mp3, so the output extension is mp3.
     4. Fourth cmd: the input is mltb.audio, so this cmd will work on all audios, and the output is mltb.mp3, so the output extension is mp3.
-- `NAME_SUBSTITUTE`: Add word/letter/character/sentence/pattern to remove, or replace with other words, case-sensitive or not. **Notes**:
+- `NAME_SUBSTITUTE`: Add word/letter/character/sentence/pattern to remove, or replace with other words, case-sensitive or not. `Str`
+  **Notes**:
   1. Seed will get disabled while using this option
   2. Before any special character you must add `\`; those are the characters: `\^$.|?*+()[]{}-`
   * Example: script/code/s | mirror/leech | tea/ /s | clone | cpu/ | \[mltb\]/mltb | \\text\\/text/s
     - script will get replaced by code, case-sensitive
     - mirror will get replaced by leech
     - tea will get replaced by a space, case-sensitive
     - clone will get removed
     - cpu will get replaced by a space
     - [mltb] will get replaced by mltb
     - \text\ will get replaced by text, case-sensitive

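The `mltb.*` placeholder convention explained above can be modelled with a small resolver. This is a sketch, not the bot's implementation; the helper name `build_ffmpeg_args` and the `.out` suffix for same-extension outputs are mine. The first `mltb*` token is treated as the input slot, any later one as the output slot.

```python
import os

def build_ffmpeg_args(cmd, input_file):
    """Resolve mltb placeholders in one FFMPEG_CMDS entry (sketch).

    mltb.<ext>            -> applies only to files with that extension
    mltb.video/mltb.audio -> applies to any video/audio input
    bare mltb (output)    -> output keeps the input file's extension
    Returns the argument list, or None when the entry does not apply.
    """
    base, ext = os.path.splitext(input_file)
    seen_input = False
    out = []
    for tok in cmd.split():
        if not tok.startswith("mltb"):
            out.append(tok)
            continue
        if not seen_input:  # first mltb* token: the input slot
            seen_input = True
            if tok not in ("mltb", "mltb.video", "mltb.audio"):
                if os.path.splitext(tok)[1] != ext:
                    return None  # e.g. an mltb.mkv cmd on an .mp3 file
            out.append(input_file)
        else:  # later mltb* token: the output slot
            if tok == "mltb":
                out.append(base + ".out" + ext)  # keep input extension
            else:
                out.append(base + os.path.splitext(tok)[1])
    return out

print(build_ffmpeg_args("-i mltb.m4a -c:a libmp3lame -q:a 2 mltb.mp3", "song.m4a"))
print(build_ffmpeg_args("-i mltb.m4a -c:a libmp3lame -q:a 2 mltb.mp3", "clip.mkv"))
```

The first call maps both placeholders onto `song.m4a`/`song.mp3`; the second returns `None` because an `mltb.m4a` command does not apply to an `.mkv` input, matching the third example in the README text.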
**3. GDrive Tools**

@@ -351,13 +339,13 @@ quotes, even if it's `Int`, `Bool` or `List`.

 **8. JDownloader**

-- `JD_EMAIL`: JDownloader email; sign up at [JDownloader](https://my.jdownloader.org/)
-- `JD_PASS`: JDownloader password
+- `JD_EMAIL`: JDownloader email; sign up at [JDownloader](https://my.jdownloader.org/). `Str`
+- `JD_PASS`: JDownloader password. `Str`
 - **JDownloader Config**: You can use the config from your local machine in the bot by *zipping* the cfg folder (cfg.zip) and adding it to the repo folder.

 **9. Sabnzbd**

-- `USENET_SERVERS`: list of dictionaries; you can add as many as you want, and there is a button for servers in Sabnzbd settings to edit current servers and add new ones.
+- `USENET_SERVERS`: list of dictionaries; you can add as many as you want, and there is a button for servers in Sabnzbd settings to edit current servers and add new ones. `List`

   ***[{'name': 'main', 'host': '', 'port': 563, 'timeout': 60, 'username': '', 'password': '', 'connections': 8, 'ssl': 1, 'ssl_verify': 2, 'ssl_ciphers': '', 'enable': 1, 'required': 0, 'optional': 0, 'retention': 0, 'send_group': 0, 'priority': 0}]***

@@ -379,8 +367,8 @@ quotes, even if it's `Int`, `Bool` or `List`.
 **11. Queue System**

 - `QUEUE_ALL`: Number of parallel download and upload tasks. For example, if 20 tasks are added and `QUEUE_ALL` is `8`,
-  then the sum of uploading and downloading tasks is 8 and the rest are queued. `Int`. **NOTE**: if you want to
-  fill `QUEUE_DOWNLOAD` or `QUEUE_UPLOAD`, then the `QUEUE_ALL` value must be greater than or equal to the greater one and
+  then the sum of uploading and downloading tasks is 8 and the rest are queued. `Int`.
+  **NOTE**: if you want to fill `QUEUE_DOWNLOAD` or `QUEUE_UPLOAD`, then the `QUEUE_ALL` value must be greater than or equal to the greater one and
   less than or equal to the sum of `QUEUE_UPLOAD` and `QUEUE_DOWNLOAD`.
 - `QUEUE_DOWNLOAD`: Number of all parallel downloading tasks. `Int`
 - `QUEUE_UPLOAD`: Number of all parallel uploading tasks. `Int`
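The `QUEUE_ALL` constraint stated in the note above can be expressed directly. This is a hypothetical validation helper to illustrate the rule, not code from the bot:

```python
def queue_config_ok(queue_all, queue_download, queue_upload):
    """Check the README rule: when both per-direction limits are set,
    QUEUE_ALL must be >= the larger of the two and <= their sum."""
    if not (queue_download and queue_upload):
        return True  # the constraint only applies when both are filled
    return max(queue_download, queue_upload) <= queue_all <= queue_download + queue_upload

assert queue_config_ok(8, 5, 4)        # 5 <= 8 <= 9: valid
assert not queue_config_ok(10, 5, 4)   # 10 > 5 + 4: invalid
assert queue_config_ok(3, 0, 4)        # only one limit set: rule skipped
```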
@@ -780,28 +768,19 @@ Format:

 ```
 machine host login username password my_password
 ```

 Example:

 ```
 machine instagram login anas.tayyar password mypassword
 ```

 **Instagram Note**: You must log in even if you want to download public posts, and after the first try you must confirm that
 it was you who logged in from a different IP (you can confirm from the phone app).

 **Youtube Note**: For `youtube` authentication,
 use a [cookies.txt](https://github.com/ytdl-org/youtube-dl#how-do-i-pass-cookies-to-youtube-dl) file.

 Using Aria2c you can also use the built-in feature from the bot, with or without a username. Here is an example for an index link without
 a username.

 ```
 machine example.workers.dev password index_password
 ```

 Where host is the name of the extractor (e.g. instagram, Twitch). Multiple accounts for different hosts can be added, each
 separated by a new line.

 **Yt-dlp**:
 Authentication using a [cookies.txt](https://github.com/yt-dlp/yt-dlp/wiki/FAQ#how-do-i-pass-cookies-to-yt-dlp) file.

 -----
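The `.netrc` format described above is the standard one, so Python's stdlib `netrc` module can parse it; a quick sketch using a temporary file (the credentials are the README's placeholder example, not real ones):

```python
from netrc import netrc
from tempfile import NamedTemporaryFile

# Write the README's example entry to a scratch file.
with NamedTemporaryFile("w", suffix=".netrc", delete=False) as f:
    f.write("machine instagram login anas.tayyar password mypassword\n")
    path = f.name

# authenticators() returns (login, account, password) for a host.
auth = netrc(path).authenticators("instagram")
print(auth)
```

This is why one line per host is required: `netrc` keys entries by the `machine` name, so multiple accounts for different extractors coexist in one file.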
539 bot/__init__.py
@@ -1,28 +1,18 @@
from sys import exit
from apscheduler.schedulers.asyncio import AsyncIOScheduler
from aria2p import API as ariaAPI, Client as ariaClient
from asyncio import Lock, new_event_loop, set_event_loop
from dotenv import load_dotenv, dotenv_values
from logging import (
    getLogger,
    FileHandler,
    StreamHandler,
    INFO,
    basicConfig,
    error as log_error,
    info as log_info,
    warning as log_warning,
    WARNING,
    ERROR,
)
from shutil import rmtree
from os import remove, path as ospath, environ
from pymongo.mongo_client import MongoClient
from pymongo.server_api import ServerApi
from pyrogram import Client as TgClient, enums
from qbittorrentapi import Client as QbClient
from sabnzbdapi import SabnzbdClient
from socket import setdefaulttimeout
from subprocess import Popen, run
from time import time
from tzlocal import get_localzone
from uvloop import install

@@ -33,12 +23,12 @@ from uvloop import install
 install()
 setdefaulttimeout(600)

-getLogger("qbittorrentapi").setLevel(INFO)
-getLogger("requests").setLevel(INFO)
-getLogger("urllib3").setLevel(INFO)
+getLogger("qbittorrentapi").setLevel(WARNING)
+getLogger("requests").setLevel(WARNING)
+getLogger("urllib3").setLevel(WARNING)
 getLogger("pyrogram").setLevel(ERROR)
-getLogger("httpx").setLevel(ERROR)
-getLogger("pymongo").setLevel(ERROR)
+getLogger("httpx").setLevel(WARNING)
+getLogger("pymongo").setLevel(WARNING)

 bot_start_time = time()
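The hunk above raises third-party logger thresholds (INFO to WARNING for the HTTP and torrent clients). A minimal illustration of the effect, with a hypothetical capturing handler to make the filtering visible:

```python
import logging

logging.basicConfig(level=logging.INFO)
logging.getLogger("urllib3").setLevel(logging.WARNING)  # as in the diff

records = []

class Capture(logging.Handler):
    """Collect messages that pass the logger's level filter."""
    def emit(self, record):
        records.append(record.getMessage())

logging.getLogger("urllib3").addHandler(Capture())
logging.getLogger("urllib3").info("connection pool chatter")  # filtered out
logging.getLogger("urllib3").warning("retrying request")      # passes
print(records)
```

Only the WARNING record survives, which is exactly how the bot silences per-request noise while keeping real problems in the log file.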
@@ -53,34 +43,22 @@ basicConfig(

LOGGER = getLogger(__name__)

load_dotenv("config.env", override=True)

intervals = {"status": {}, "qb": "", "jd": "", "nzb": "", "stopAll": False}
qb_torrents = {}
jd_downloads = {}
nzb_jobs = {}
drives_names = []
drives_ids = []
index_urls = []
global_extension_filter = ["aria2", "!qB"]
user_data = {}
aria2_options = {}
qbit_options = {}
nzb_options = {}
queued_dl = {}
queued_up = {}
status_dict = {}
task_dict = {}
rss_dict = {}
non_queued_dl = set()
non_queued_up = set()
multi_tags = set()

try:
    if bool(environ.get("_____REMOVE_THIS_LINE_____")):
        log_error("The README.md file there to be read! Exiting now!")
        bot_loop.stop()
        exit(1)
except:
    pass

task_dict_lock = Lock()
queue_dict_lock = Lock()
qb_listener_lock = Lock()
@@ -89,440 +67,12 @@ jd_lock = Lock()
cpu_eater_lock = Lock()
subprocess_lock = Lock()
same_directory_lock = Lock()
status_dict = {}
task_dict = {}
rss_dict = {}
extension_filter = ["aria2", "!qB"]
drives_names = []
drives_ids = []
index_urls = []

BOT_TOKEN = environ.get("BOT_TOKEN", "")
if len(BOT_TOKEN) == 0:
    log_error("BOT_TOKEN variable is missing! Exiting now")
    bot_loop.stop()
    exit(1)

BOT_ID = BOT_TOKEN.split(":", 1)[0]

DATABASE_URL = environ.get("DATABASE_URL", "")
if len(DATABASE_URL) == 0:
    DATABASE_URL = ""

if DATABASE_URL:
    try:
        conn = MongoClient(DATABASE_URL, server_api=ServerApi("1"))
        db = conn.mltb
        current_config = dict(dotenv_values("config.env"))
        old_config = db.settings.deployConfig.find_one({"_id": BOT_ID})
        if old_config is None:
            db.settings.deployConfig.replace_one(
                {"_id": BOT_ID}, current_config, upsert=True
            )
        else:
            del old_config["_id"]
        if old_config and old_config != current_config:
            db.settings.deployConfig.replace_one(
                {"_id": BOT_ID}, current_config, upsert=True
            )
        elif config_dict := db.settings.config.find_one({"_id": BOT_ID}):
            del config_dict["_id"]
            for key, value in config_dict.items():
                environ[key] = str(value)
        if pf_dict := db.settings.files.find_one({"_id": BOT_ID}):
            del pf_dict["_id"]
            for key, value in pf_dict.items():
                if value:
                    file_ = key.replace("__", ".")
                    with open(file_, "wb+") as f:
                        f.write(value)
        if a2c_options := db.settings.aria2c.find_one({"_id": BOT_ID}):
            del a2c_options["_id"]
            aria2_options = a2c_options
        if qbit_opt := db.settings.qbittorrent.find_one({"_id": BOT_ID}):
            del qbit_opt["_id"]
            qbit_options = qbit_opt
        if nzb_opt := db.settings.nzb.find_one({"_id": BOT_ID}):
            if ospath.exists("sabnzbd/SABnzbd.ini.bak"):
                remove("sabnzbd/SABnzbd.ini.bak")
            del nzb_opt["_id"]
            ((key, value),) = nzb_opt.items()
            file_ = key.replace("__", ".")
            with open(f"sabnzbd/{file_}", "wb+") as f:
                f.write(value)
        conn.close()
        BOT_TOKEN = environ.get("BOT_TOKEN", "")
        BOT_ID = BOT_TOKEN.split(":", 1)[0]
        DATABASE_URL = environ.get("DATABASE_URL", "")
    except Exception as e:
        LOGGER.error(f"Database ERROR: {e}")
else:
    config_dict = {}
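The removed block above is the old implementation of the behaviour the README describes: snapshot the deploy config in MongoDB on first build, and on later builds load settings from the database only when the config file is unchanged. A toy model of just that decision, with MongoDB stripped out (function name is mine):

```python
def pick_config_source(stored_deploy, current):
    """Simplified mirror of the deployConfig decision flow above."""
    if stored_deploy is None:
        return "store current"       # first build: snapshot the config
    if stored_deploy != current:
        return "store current"       # config file was edited: re-snapshot
    return "load from database"      # unchanged: DB-saved settings win

assert pick_config_source(None, {"OWNER_ID": "1"}) == "store current"
assert pick_config_source({"OWNER_ID": "1"}, {"OWNER_ID": "2"}) == "store current"
assert pick_config_source({"OWNER_ID": "1"}, {"OWNER_ID": "1"}) == "load from database"
```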
if ospath.exists("cfg.zip"):
    if ospath.exists("/JDownloader/cfg"):
        rmtree("/JDownloader/cfg", ignore_errors=True)
    run(["7z", "x", "cfg.zip", "-o/JDownloader"])
    remove("cfg.zip")

if not ospath.exists(".netrc"):
    with open(".netrc", "w"):
        pass
run(
    "chmod 600 .netrc && cp .netrc /root/.netrc && chmod +x aria-nox-nzb.sh && ./aria-nox-nzb.sh",
    shell=True,
)

OWNER_ID = environ.get("OWNER_ID", "")
if len(OWNER_ID) == 0:
    log_error("OWNER_ID variable is missing! Exiting now")
    bot_loop.stop()
    exit(1)
else:
    OWNER_ID = int(OWNER_ID)

TELEGRAM_API = environ.get("TELEGRAM_API", "")
if len(TELEGRAM_API) == 0:
    log_error("TELEGRAM_API variable is missing! Exiting now")
    bot_loop.stop()
    exit(1)
else:
    TELEGRAM_API = int(TELEGRAM_API)

TELEGRAM_HASH = environ.get("TELEGRAM_HASH", "")
if len(TELEGRAM_HASH) == 0:
    log_error("TELEGRAM_HASH variable is missing! Exiting now")
    bot_loop.stop()
    exit(1)

USER_SESSION_STRING = environ.get("USER_SESSION_STRING", "")
if len(USER_SESSION_STRING) != 0:
    log_info("Creating client from USER_SESSION_STRING")
    try:
        user = TgClient(
            "user",
            TELEGRAM_API,
            TELEGRAM_HASH,
            session_string=USER_SESSION_STRING,
            parse_mode=enums.ParseMode.HTML,
            max_concurrent_transmissions=10,
        ).start()
        IS_PREMIUM_USER = user.me.is_premium
    except:
        log_error("Failed to start client from USER_SESSION_STRING")
        IS_PREMIUM_USER = False
        user = ""
else:
    IS_PREMIUM_USER = False
    user = ""

GDRIVE_ID = environ.get("GDRIVE_ID", "")
if len(GDRIVE_ID) == 0:
    GDRIVE_ID = ""

RCLONE_PATH = environ.get("RCLONE_PATH", "")
if len(RCLONE_PATH) == 0:
    RCLONE_PATH = ""

RCLONE_FLAGS = environ.get("RCLONE_FLAGS", "")
if len(RCLONE_FLAGS) == 0:
    RCLONE_FLAGS = ""

DEFAULT_UPLOAD = environ.get("DEFAULT_UPLOAD", "")
if DEFAULT_UPLOAD != "gd":
    DEFAULT_UPLOAD = "rc"

DOWNLOAD_DIR = environ.get("DOWNLOAD_DIR", "")
if len(DOWNLOAD_DIR) == 0:
    DOWNLOAD_DIR = "/usr/src/app/downloads/"
elif not DOWNLOAD_DIR.endswith("/"):
    DOWNLOAD_DIR = f"{DOWNLOAD_DIR}/"

AUTHORIZED_CHATS = environ.get("AUTHORIZED_CHATS", "")
if len(AUTHORIZED_CHATS) != 0:
    aid = AUTHORIZED_CHATS.split()
    for id_ in aid:
        chat_id, *thread_ids = id_.split("|")
        chat_id = int(chat_id.strip())
        if thread_ids:
            thread_ids = list(map(lambda x: int(x.strip()), thread_ids))
            user_data[chat_id] = {"is_auth": True, "thread_ids": thread_ids}
        else:
            user_data[chat_id] = {"is_auth": True}
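The `AUTHORIZED_CHATS` parsing just above splits each space-separated entry on `|` into a chat id plus optional topic/thread ids, matching the README's `chat_id|thread_id` format. A standalone sketch of the same logic:

```python
def parse_authorized_chats(value):
    """Parse 'chat_id' or 'chat_id|thread_id|...' entries, space-separated."""
    user_data = {}
    for entry in value.split():
        chat_id, *thread_ids = entry.split("|")
        chat_id = int(chat_id.strip())
        if thread_ids:
            user_data[chat_id] = {
                "is_auth": True,
                "thread_ids": [int(t.strip()) for t in thread_ids],
            }
        else:
            user_data[chat_id] = {"is_auth": True}
    return user_data

data = parse_authorized_chats("-100123 -100456|10|12")
print(data)
```

The whole chat is authorized when no thread ids are given; otherwise only the listed topics are, which is why the variable became a `Str` in this commit (the `|` syntax is not a valid `Int`).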
SUDO_USERS = environ.get("SUDO_USERS", "")
if len(SUDO_USERS) != 0:
    aid = SUDO_USERS.split()
    for id_ in aid:
        user_data[int(id_.strip())] = {"is_sudo": True}

EXTENSION_FILTER = environ.get("EXTENSION_FILTER", "")
if len(EXTENSION_FILTER) > 0:
    fx = EXTENSION_FILTER.split()
    for x in fx:
        x = x.lstrip(".")
        global_extension_filter.append(x.strip().lower())

JD_EMAIL = environ.get("JD_EMAIL", "")
JD_PASS = environ.get("JD_PASS", "")
if len(JD_EMAIL) == 0 or len(JD_PASS) == 0:
    JD_EMAIL = ""
    JD_PASS = ""

USENET_SERVERS = environ.get("USENET_SERVERS", "")
try:
    if len(USENET_SERVERS) == 0:
        USENET_SERVERS = []
    elif (us := eval(USENET_SERVERS)) and not us[0].get("host"):
        USENET_SERVERS = []
    else:
        USENET_SERVERS = eval(USENET_SERVERS)
except:
    log_error(f"Wrong USENET_SERVERS format: {USENET_SERVERS}")
    USENET_SERVERS = []
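The `USENET_SERVERS` block above runs `eval` on an environment variable, which accepts arbitrary Python expressions. As a suggestion (not the bot's code), `ast.literal_eval` parses the same list-of-dicts syntax while rejecting anything executable; a sketch with the same empty-host fallback:

```python
from ast import literal_eval

def parse_usenet_servers(raw):
    """Parse a USENET_SERVERS string safely, keeping the fallbacks above."""
    if not raw:
        return []
    try:
        servers = literal_eval(raw)  # literals only: no function calls
    except (ValueError, SyntaxError):
        return []
    # Drop the config when the first server has no host, as in the original.
    if not servers or not servers[0].get("host"):
        return []
    return servers

ok = parse_usenet_servers("[{'name': 'main', 'host': 'news.example', 'port': 563}]")
print(len(ok))
bad = parse_usenet_servers("__import__('os').system('true')")
print(bad)
```

The second call returns `[]` because `literal_eval` refuses the call expression, where plain `eval` would have executed it.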
FILELION_API = environ.get("FILELION_API", "")
if len(FILELION_API) == 0:
    FILELION_API = ""

STREAMWISH_API = environ.get("STREAMWISH_API", "")
if len(STREAMWISH_API) == 0:
    STREAMWISH_API = ""

INDEX_URL = environ.get("INDEX_URL", "").rstrip("/")
if len(INDEX_URL) == 0:
    INDEX_URL = ""

SEARCH_API_LINK = environ.get("SEARCH_API_LINK", "").rstrip("/")
if len(SEARCH_API_LINK) == 0:
    SEARCH_API_LINK = ""

LEECH_FILENAME_PREFIX = environ.get("LEECH_FILENAME_PREFIX", "")
if len(LEECH_FILENAME_PREFIX) == 0:
    LEECH_FILENAME_PREFIX = ""

SEARCH_PLUGINS = environ.get("SEARCH_PLUGINS", "")
if len(SEARCH_PLUGINS) == 0:
    SEARCH_PLUGINS = ""
else:
    try:
        SEARCH_PLUGINS = eval(SEARCH_PLUGINS)
    except:
        log_error(f"Wrong SEARCH_PLUGINS format: {SEARCH_PLUGINS}")
        SEARCH_PLUGINS = ""

MAX_SPLIT_SIZE = 4194304000 if IS_PREMIUM_USER else 2097152000

LEECH_SPLIT_SIZE = environ.get("LEECH_SPLIT_SIZE", "")
if (
    len(LEECH_SPLIT_SIZE) == 0
    or int(LEECH_SPLIT_SIZE) > MAX_SPLIT_SIZE
    or LEECH_SPLIT_SIZE == "2097152000"
):
    LEECH_SPLIT_SIZE = MAX_SPLIT_SIZE
else:
    LEECH_SPLIT_SIZE = int(LEECH_SPLIT_SIZE)
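The `LEECH_SPLIT_SIZE` clamping above (premium accounts get roughly a 4 GB part limit, others roughly 2 GB) can be isolated into a pure function for illustration; the function name is mine:

```python
def resolve_split_size(raw, is_premium):
    """Clamp LEECH_SPLIT_SIZE to the account's max, as in the block above."""
    max_size = 4194304000 if is_premium else 2097152000  # ~4 GB / ~2 GB
    # Empty, over-limit, or the old non-premium default all fall back to max.
    if not raw or int(raw) > max_size or raw == "2097152000":
        return max_size
    return int(raw)

assert resolve_split_size("", False) == 2097152000          # unset -> default
assert resolve_split_size("5000000000", True) == 4194304000 # over limit -> clamped
assert resolve_split_size("1000000000", False) == 1000000000
```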
STATUS_UPDATE_INTERVAL = environ.get("STATUS_UPDATE_INTERVAL", "")
if len(STATUS_UPDATE_INTERVAL) == 0:
    STATUS_UPDATE_INTERVAL = 15
else:
    STATUS_UPDATE_INTERVAL = int(STATUS_UPDATE_INTERVAL)

YT_DLP_OPTIONS = environ.get("YT_DLP_OPTIONS", "")
if len(YT_DLP_OPTIONS) == 0:
    YT_DLP_OPTIONS = ""

SEARCH_LIMIT = environ.get("SEARCH_LIMIT", "")
SEARCH_LIMIT = 0 if len(SEARCH_LIMIT) == 0 else int(SEARCH_LIMIT)

LEECH_DUMP_CHAT = environ.get("LEECH_DUMP_CHAT", "")
LEECH_DUMP_CHAT = "" if len(LEECH_DUMP_CHAT) == 0 else LEECH_DUMP_CHAT

STATUS_LIMIT = environ.get("STATUS_LIMIT", "")
STATUS_LIMIT = 4 if len(STATUS_LIMIT) == 0 else int(STATUS_LIMIT)

CMD_SUFFIX = environ.get("CMD_SUFFIX", "")

RSS_CHAT = environ.get("RSS_CHAT", "")
RSS_CHAT = "" if len(RSS_CHAT) == 0 else RSS_CHAT

RSS_DELAY = environ.get("RSS_DELAY", "")
RSS_DELAY = 600 if len(RSS_DELAY) == 0 else int(RSS_DELAY)

TORRENT_TIMEOUT = environ.get("TORRENT_TIMEOUT", "")
TORRENT_TIMEOUT = "" if len(TORRENT_TIMEOUT) == 0 else int(TORRENT_TIMEOUT)

QUEUE_ALL = environ.get("QUEUE_ALL", "")
QUEUE_ALL = "" if len(QUEUE_ALL) == 0 else int(QUEUE_ALL)

QUEUE_DOWNLOAD = environ.get("QUEUE_DOWNLOAD", "")
QUEUE_DOWNLOAD = "" if len(QUEUE_DOWNLOAD) == 0 else int(QUEUE_DOWNLOAD)

QUEUE_UPLOAD = environ.get("QUEUE_UPLOAD", "")
QUEUE_UPLOAD = "" if len(QUEUE_UPLOAD) == 0 else int(QUEUE_UPLOAD)

INCOMPLETE_TASK_NOTIFIER = environ.get("INCOMPLETE_TASK_NOTIFIER", "")
INCOMPLETE_TASK_NOTIFIER = INCOMPLETE_TASK_NOTIFIER.lower() == "true"

STOP_DUPLICATE = environ.get("STOP_DUPLICATE", "")
STOP_DUPLICATE = STOP_DUPLICATE.lower() == "true"

IS_TEAM_DRIVE = environ.get("IS_TEAM_DRIVE", "")
IS_TEAM_DRIVE = IS_TEAM_DRIVE.lower() == "true"

USE_SERVICE_ACCOUNTS = environ.get("USE_SERVICE_ACCOUNTS", "")
USE_SERVICE_ACCOUNTS = USE_SERVICE_ACCOUNTS.lower() == "true"

WEB_PINCODE = environ.get("WEB_PINCODE", "")
WEB_PINCODE = WEB_PINCODE.lower() == "true"

AS_DOCUMENT = environ.get("AS_DOCUMENT", "")
AS_DOCUMENT = AS_DOCUMENT.lower() == "true"

EQUAL_SPLITS = environ.get("EQUAL_SPLITS", "")
EQUAL_SPLITS = EQUAL_SPLITS.lower() == "true"

MEDIA_GROUP = environ.get("MEDIA_GROUP", "")
MEDIA_GROUP = MEDIA_GROUP.lower() == "true"

USER_TRANSMISSION = environ.get("USER_TRANSMISSION", "")
USER_TRANSMISSION = USER_TRANSMISSION.lower() == "true" and IS_PREMIUM_USER

BASE_URL_PORT = environ.get("BASE_URL_PORT", "")
BASE_URL_PORT = 80 if len(BASE_URL_PORT) == 0 else int(BASE_URL_PORT)

BASE_URL = environ.get("BASE_URL", "").rstrip("/")
if len(BASE_URL) == 0:
    log_warning("BASE_URL not provided!")
    BASE_URL = ""

UPSTREAM_REPO = environ.get("UPSTREAM_REPO", "")
if len(UPSTREAM_REPO) == 0:
    UPSTREAM_REPO = ""

UPSTREAM_BRANCH = environ.get("UPSTREAM_BRANCH", "")
if len(UPSTREAM_BRANCH) == 0:
    UPSTREAM_BRANCH = "master"

RCLONE_SERVE_URL = environ.get("RCLONE_SERVE_URL", "").rstrip("/")
if len(RCLONE_SERVE_URL) == 0:
    RCLONE_SERVE_URL = ""

RCLONE_SERVE_PORT = environ.get("RCLONE_SERVE_PORT", "")
RCLONE_SERVE_PORT = 8080 if len(RCLONE_SERVE_PORT) == 0 else int(RCLONE_SERVE_PORT)

RCLONE_SERVE_USER = environ.get("RCLONE_SERVE_USER", "")
if len(RCLONE_SERVE_USER) == 0:
    RCLONE_SERVE_USER = ""

RCLONE_SERVE_PASS = environ.get("RCLONE_SERVE_PASS", "")
if len(RCLONE_SERVE_PASS) == 0:
    RCLONE_SERVE_PASS = ""

NAME_SUBSTITUTE = environ.get("NAME_SUBSTITUTE", "")
NAME_SUBSTITUTE = "" if len(NAME_SUBSTITUTE) == 0 else NAME_SUBSTITUTE

MIXED_LEECH = environ.get("MIXED_LEECH", "")
MIXED_LEECH = MIXED_LEECH.lower() == "true" and IS_PREMIUM_USER

THUMBNAIL_LAYOUT = environ.get("THUMBNAIL_LAYOUT", "")
THUMBNAIL_LAYOUT = "" if len(THUMBNAIL_LAYOUT) == 0 else THUMBNAIL_LAYOUT

FFMPEG_CMDS = environ.get("FFMPEG_CMDS", "")
try:
    FFMPEG_CMDS = [] if len(FFMPEG_CMDS) == 0 else eval(FFMPEG_CMDS)
except:
    log_error(f"Wrong FFMPEG_CMDS format: {FFMPEG_CMDS}")
    FFMPEG_CMDS = []
config_dict = {
    "AS_DOCUMENT": AS_DOCUMENT,
    "AUTHORIZED_CHATS": AUTHORIZED_CHATS,
    "BASE_URL": BASE_URL,
    "BASE_URL_PORT": BASE_URL_PORT,
    "BOT_TOKEN": BOT_TOKEN,
    "CMD_SUFFIX": CMD_SUFFIX,
    "DATABASE_URL": DATABASE_URL,
    "DEFAULT_UPLOAD": DEFAULT_UPLOAD,
    "DOWNLOAD_DIR": DOWNLOAD_DIR,
    "EQUAL_SPLITS": EQUAL_SPLITS,
    "EXTENSION_FILTER": EXTENSION_FILTER,
    "FFMPEG_CMDS": FFMPEG_CMDS,
    "FILELION_API": FILELION_API,
    "GDRIVE_ID": GDRIVE_ID,
    "INCOMPLETE_TASK_NOTIFIER": INCOMPLETE_TASK_NOTIFIER,
    "INDEX_URL": INDEX_URL,
    "IS_TEAM_DRIVE": IS_TEAM_DRIVE,
    "JD_EMAIL": JD_EMAIL,
    "JD_PASS": JD_PASS,
    "LEECH_DUMP_CHAT": LEECH_DUMP_CHAT,
    "LEECH_FILENAME_PREFIX": LEECH_FILENAME_PREFIX,
    "LEECH_SPLIT_SIZE": LEECH_SPLIT_SIZE,
    "MEDIA_GROUP": MEDIA_GROUP,
    "MIXED_LEECH": MIXED_LEECH,
    "NAME_SUBSTITUTE": NAME_SUBSTITUTE,
    "OWNER_ID": OWNER_ID,
    "QUEUE_ALL": QUEUE_ALL,
    "QUEUE_DOWNLOAD": QUEUE_DOWNLOAD,
    "QUEUE_UPLOAD": QUEUE_UPLOAD,
    "RCLONE_FLAGS": RCLONE_FLAGS,
    "RCLONE_PATH": RCLONE_PATH,
    "RCLONE_SERVE_URL": RCLONE_SERVE_URL,
    "RCLONE_SERVE_USER": RCLONE_SERVE_USER,
    "RCLONE_SERVE_PASS": RCLONE_SERVE_PASS,
    "RCLONE_SERVE_PORT": RCLONE_SERVE_PORT,
    "RSS_CHAT": RSS_CHAT,
    "RSS_DELAY": RSS_DELAY,
    "SEARCH_API_LINK": SEARCH_API_LINK,
    "SEARCH_LIMIT": SEARCH_LIMIT,
    "SEARCH_PLUGINS": SEARCH_PLUGINS,
    "STATUS_LIMIT": STATUS_LIMIT,
    "STATUS_UPDATE_INTERVAL": STATUS_UPDATE_INTERVAL,
    "STOP_DUPLICATE": STOP_DUPLICATE,
    "STREAMWISH_API": STREAMWISH_API,
    "SUDO_USERS": SUDO_USERS,
    "TELEGRAM_API": TELEGRAM_API,
    "TELEGRAM_HASH": TELEGRAM_HASH,
    "THUMBNAIL_LAYOUT": THUMBNAIL_LAYOUT,
    "TORRENT_TIMEOUT": TORRENT_TIMEOUT,
    "USER_TRANSMISSION": USER_TRANSMISSION,
    "UPSTREAM_REPO": UPSTREAM_REPO,
    "UPSTREAM_BRANCH": UPSTREAM_BRANCH,
    "USENET_SERVERS": USENET_SERVERS,
    "USER_SESSION_STRING": USER_SESSION_STRING,
    "USE_SERVICE_ACCOUNTS": USE_SERVICE_ACCOUNTS,
    "WEB_PINCODE": WEB_PINCODE,
    "YT_DLP_OPTIONS": YT_DLP_OPTIONS,
}

if GDRIVE_ID:
    drives_names.append("Main")
    drives_ids.append(GDRIVE_ID)
    index_urls.append(INDEX_URL)

if ospath.exists("list_drives.txt"):
    with open("list_drives.txt", "r+") as f:
        lines = f.readlines()
        for line in lines:
            temp = line.strip().split()
            drives_ids.append(temp[1])
            drives_names.append(temp[0].replace("_", " "))
            if len(temp) > 2:
                index_urls.append(temp[2])
            else:
                index_urls.append("")
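The `list_drives.txt` loop above expects one drive per line, in the order name, drive id, and an optional index URL, with underscores in the name standing in for spaces. A standalone sketch (function name is mine):

```python
def parse_list_drives(lines):
    """Parse list_drives.txt rows: NAME DRIVE_ID [INDEX_URL], as above."""
    names, ids, indexes = [], [], []
    for line in lines:
        parts = line.strip().split()
        ids.append(parts[1])
        names.append(parts[0].replace("_", " "))  # underscores -> spaces
        indexes.append(parts[2] if len(parts) > 2 else "")
    return names, ids, indexes

names, ids, indexes = parse_list_drives(
    ["My_Drive abc123 https://idx.example", "Backup xyz789"]
)
print(names, ids, indexes)
```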
if BASE_URL:
    Popen(
        f"gunicorn web.wserver:app --bind 0.0.0.0:{BASE_URL_PORT} --worker-class gevent",
        shell=True,
    )

if ospath.exists("accounts.zip"):
    if ospath.exists("accounts"):
        rmtree("accounts")
    run(["7z", "x", "-o.", "-aoa", "accounts.zip", "accounts/*.json"])
    run(["chmod", "-R", "777", "accounts"])
    remove("accounts.zip")
if not ospath.exists("accounts"):
    config_dict["USE_SERVICE_ACCOUNTS"] = False

aria2 = ariaAPI(ariaClient(host="http://localhost", port=6800, secret=""))

qbittorrent_client = QbClient(
    host="localhost",
@ -542,65 +92,4 @@ sabnzbd_client = SabnzbdClient(
|
||||
port="8070",
|
||||
)
|
||||
|
||||
|
||||
aria2c_global = [
|
||||
"bt-max-open-files",
|
||||
"download-result",
|
||||
"keep-unfinished-download-result",
|
||||
"log",
|
||||
"log-level",
|
||||
"max-concurrent-downloads",
|
||||
"max-download-result",
|
||||
"max-overall-download-limit",
|
||||
"save-session",
|
||||
"max-overall-upload-limit",
|
||||
"optimize-concurrent-downloads",
|
||||
"save-cookies",
|
||||
"server-stat-of",
|
||||
]
|
||||
|
||||
log_info("Creating client from BOT_TOKEN")
|
||||
bot = TgClient(
|
||||
"bot",
|
||||
TELEGRAM_API,
|
||||
TELEGRAM_HASH,
|
||||
bot_token=BOT_TOKEN,
|
||||
parse_mode=enums.ParseMode.HTML,
|
||||
max_concurrent_transmissions=10,
|
||||
).start()
|
||||
bot_name = bot.me.username
|
||||
|
||||
scheduler = AsyncIOScheduler(timezone=str(get_localzone()), event_loop=bot_loop)
|
||||
|
||||
|
||||
def get_qb_options():
|
||||
global qbit_options
|
||||
if not qbit_options:
|
||||
qbit_options = dict(qbittorrent_client.app_preferences())
|
||||
del qbit_options["listen_port"]
|
||||
for k in list(qbit_options.keys()):
|
||||
if k.startswith("rss"):
|
||||
del qbit_options[k]
|
||||
qbittorrent_client.app_set_preferences({"web_ui_password": "mltbmltb"})
|
||||
else:
|
||||
qbit_options["web_ui_password"] = "mltbmltb"
|
||||
qb_opt = {**qbit_options}
|
||||
qbittorrent_client.app_set_preferences(qb_opt)
|
||||
|
||||
|
||||
get_qb_options()
|
||||
|
||||
aria2 = ariaAPI(ariaClient(host="http://localhost", port=6800, secret=""))
|
||||
if not aria2_options:
|
||||
aria2_options = aria2.client.get_global_option()
|
||||
else:
|
||||
a2c_glo = {op: aria2_options[op] for op in aria2c_global if op in aria2_options}
|
||||
aria2.set_global_options(a2c_glo)
|
||||
|
||||
|
||||
async def get_nzb_options():
|
||||
global nzb_options
|
||||
nzb_options = (await sabnzbd_client.get_config())["config"]["misc"]
|
||||
|
||||
|
||||
bot_loop.run_until_complete(get_nzb_options())
|
||||
|
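The `aria2c_global` allowlist above is applied with a dict comprehension that keeps only recognized global options before calling `aria2.set_global_options`. A minimal, standalone sketch of that filtering step (the trimmed allowlist and `filter_global_options` helper are illustrative, not part of the bot):

```python
# Trimmed allowlist of aria2c options that may be applied globally
# (a subset of the full aria2c_global list above).
aria2c_global = ["max-concurrent-downloads", "max-download-result", "log-level"]


def filter_global_options(saved_options):
    # Saved options may also contain per-download keys (e.g. "dir");
    # keep only keys that appear in the global allowlist, exactly like
    # `a2c_glo = {op: aria2_options[op] for op in aria2c_global if op in aria2_options}`.
    return {op: saved_options[op] for op in aria2c_global if op in saved_options}


saved = {"max-concurrent-downloads": "5", "dir": "/downloads", "log-level": "warn"}
filtered = filter_global_options(saved)  # only the allowlisted keys survive
```

This keeps restored session options from pushing per-download settings into aria2c's global configuration.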
330	bot/__main__.py
@@ -1,322 +1,56 @@
from aiofiles import open as aiopen
from aiofiles.os import path as aiopath, remove
from asyncio import gather, create_subprocess_exec
from os import execl as osexecl
from psutil import (
    disk_usage,
    cpu_percent,
    swap_memory,
    cpu_count,
    virtual_memory,
    net_io_counters,
    boot_time,
)
from pyrogram.filters import command
from pyrogram.handlers import MessageHandler
from signal import signal, SIGINT
from sys import executable
from time import time
from asyncio import gather

from bot import (
    bot,
    bot_start_time,
    LOGGER,
    intervals,
    config_dict,
    scheduler,
    sabnzbd_client,
from .core.config_manager import Config

Config.load()

from . import LOGGER, bot_loop
from .core.mltb_client import TgClient
from .core.handlers import add_handlers
from .core.startup import (
    load_settings,
    load_configurations,
    save_settings,
    update_aria2_options,
    update_nzb_options,
    update_qb_options,
    update_variables,
)
from .helper.ext_utils.telegraph_helper import telegraph
from .helper.ext_utils.bot_utils import (
    cmd_exec,
    sync_to_async,
    create_help_buttons,
    new_task,
)
from .helper.ext_utils.db_handler import database
from .helper.ext_utils.bot_utils import sync_to_async, create_help_buttons
from .helper.ext_utils.files_utils import clean_all, exit_clean_up
from .helper.ext_utils.jdownloader_booter import jdownloader
from .helper.ext_utils.status_utils import get_readable_file_size, get_readable_time
from .helper.listeners.aria2_listener import start_aria2_listener
from .helper.mirror_leech_utils.rclone_utils.serve import rclone_serve_booter
from .helper.telegram_helper.bot_commands import BotCommands
from .helper.telegram_helper.button_build import ButtonMaker
from .helper.telegram_helper.filters import CustomFilters
from .helper.telegram_helper.message_utils import send_message, edit_message, send_file
from .modules import (
    authorize,
    cancel_task,
    clone,
    exec,
    file_selector,
    gd_count,
    gd_delete,
    gd_search,
    mirror_leech,
    status,
    ytdlp,
    shell,
    users_settings,
    bot_settings,
    help,
    force_start,
)


@new_task
async def stats(_, message):
    if await aiopath.exists(".git"):
        last_commit = await cmd_exec(
            "git log -1 --date=short --pretty=format:'%cd <b>From</b> %cr'", True
        )
        last_commit = last_commit[0]
    else:
        last_commit = "No UPSTREAM_REPO"
    total, used, free, disk = disk_usage("/")
    swap = swap_memory()
    memory = virtual_memory()
    stats = (
        f"<b>Commit Date:</b> {last_commit}\n\n"
        f"<b>Bot Uptime:</b> {get_readable_time(time() - bot_start_time)}\n"
        f"<b>OS Uptime:</b> {get_readable_time(time() - boot_time())}\n\n"
        f"<b>Total Disk Space:</b> {get_readable_file_size(total)}\n"
        f"<b>Used:</b> {get_readable_file_size(used)} | <b>Free:</b> {get_readable_file_size(free)}\n\n"
        f"<b>Upload:</b> {get_readable_file_size(net_io_counters().bytes_sent)}\n"
        f"<b>Download:</b> {get_readable_file_size(net_io_counters().bytes_recv)}\n\n"
        f"<b>CPU:</b> {cpu_percent(interval=0.5)}%\n"
        f"<b>RAM:</b> {memory.percent}%\n"
        f"<b>DISK:</b> {disk}%\n\n"
        f"<b>Physical Cores:</b> {cpu_count(logical=False)}\n"
        f"<b>Total Cores:</b> {cpu_count(logical=True)}\n\n"
        f"<b>SWAP:</b> {get_readable_file_size(swap.total)} | <b>Used:</b> {swap.percent}%\n"
        f"<b>Memory Total:</b> {get_readable_file_size(memory.total)}\n"
        f"<b>Memory Free:</b> {get_readable_file_size(memory.available)}\n"
        f"<b>Memory Used:</b> {get_readable_file_size(memory.used)}\n"
    )
    await send_message(message, stats)


@new_task
async def start(client, message):
    buttons = ButtonMaker()
    buttons.url_button(
        "Repo", "https://www.github.com/anasty17/mirror-leech-telegram-bot"
    )
    buttons.url_button("Code Owner", "https://t.me/anas_tayyar")
    reply_markup = buttons.build_menu(2)
    if await CustomFilters.authorized(client, message):
        start_string = f"""
This bot can mirror all your links|files|torrents to Google Drive or any rclone cloud or to telegram.
Type /{BotCommands.HelpCommand} to get a list of available commands
"""
        await send_message(message, start_string, reply_markup)
    else:
        await send_message(
            message,
            "You Are not authorized user! Deploy your own mirror-leech bot",
            reply_markup,
        )


@new_task
async def restart(_, message):
    intervals["stopAll"] = True
    restart_message = await send_message(message, "Restarting...")
    if scheduler.running:
        scheduler.shutdown(wait=False)
    if qb := intervals["qb"]:
        qb.cancel()
    if jd := intervals["jd"]:
        jd.cancel()
    if nzb := intervals["nzb"]:
        nzb.cancel()
    if st := intervals["status"]:
        for intvl in list(st.values()):
            intvl.cancel()
    await sync_to_async(clean_all)
    if sabnzbd_client.LOGGED_IN:
        await gather(
            sabnzbd_client.pause_all(),
            sabnzbd_client.purge_all(True),
            sabnzbd_client.delete_history("all", delete_files=True),
        )
    proc1 = await create_subprocess_exec(
        "pkill",
        "-9",
        "-f",
        "gunicorn|aria2c|qbittorrent-nox|ffmpeg|rclone|java|sabnzbdplus",
    )
    proc2 = await create_subprocess_exec("python3", "update.py")
    await gather(proc1.wait(), proc2.wait())
    async with aiopen(".restartmsg", "w") as f:
        await f.write(f"{restart_message.chat.id}\n{restart_message.id}\n")
    osexecl(executable, executable, "-m", "bot")


@new_task
async def ping(_, message):
    start_time = int(round(time() * 1000))
    reply = await send_message(message, "Starting Ping")
    end_time = int(round(time() * 1000))
    await edit_message(reply, f"{end_time - start_time} ms")


@new_task
async def log(_, message):
    await send_file(message, "log.txt")


help_string = f"""
NOTE: Try each command without any argument to see more details.
/{BotCommands.MirrorCommand[0]} or /{BotCommands.MirrorCommand[1]}: Start mirroring to cloud.
/{BotCommands.QbMirrorCommand[0]} or /{BotCommands.QbMirrorCommand[1]}: Start Mirroring to cloud using qBittorrent.
/{BotCommands.JdMirrorCommand[0]} or /{BotCommands.JdMirrorCommand[1]}: Start Mirroring to cloud using JDownloader.
/{BotCommands.NzbMirrorCommand[0]} or /{BotCommands.NzbMirrorCommand[1]}: Start Mirroring to cloud using Sabnzbd.
/{BotCommands.YtdlCommand[0]} or /{BotCommands.YtdlCommand[1]}: Mirror yt-dlp supported link.
/{BotCommands.LeechCommand[0]} or /{BotCommands.LeechCommand[1]}: Start leeching to Telegram.
/{BotCommands.QbLeechCommand[0]} or /{BotCommands.QbLeechCommand[1]}: Start leeching using qBittorrent.
/{BotCommands.JdLeechCommand[0]} or /{BotCommands.JdLeechCommand[1]}: Start leeching using JDownloader.
/{BotCommands.NzbLeechCommand[0]} or /{BotCommands.NzbLeechCommand[1]}: Start leeching using Sabnzbd.
/{BotCommands.YtdlLeechCommand[0]} or /{BotCommands.YtdlLeechCommand[1]}: Leech yt-dlp supported link.
/{BotCommands.CloneCommand} [drive_url]: Copy file/folder to Google Drive.
/{BotCommands.CountCommand} [drive_url]: Count file/folder of Google Drive.
/{BotCommands.DeleteCommand} [drive_url]: Delete file/folder from Google Drive (Only Owner & Sudo).
/{BotCommands.UserSetCommand[0]} or /{BotCommands.UserSetCommand[1]} [query]: Users settings.
/{BotCommands.BotSetCommand[0]} or /{BotCommands.BotSetCommand[1]} [query]: Bot settings.
/{BotCommands.SelectCommand}: Select files from torrents or nzb by gid or reply.
/{BotCommands.CancelTaskCommand[0]} or /{BotCommands.CancelTaskCommand[1]} [gid]: Cancel task by gid or reply.
/{BotCommands.ForceStartCommand[0]} or /{BotCommands.ForceStartCommand[1]} [gid]: Force start task by gid or reply.
/{BotCommands.CancelAllCommand} [query]: Cancel all [status] tasks.
/{BotCommands.ListCommand} [query]: Search in Google Drive(s).
/{BotCommands.SearchCommand} [query]: Search for torrents with API.
/{BotCommands.StatusCommand}: Shows a status of all the downloads.
/{BotCommands.StatsCommand}: Show stats of the machine where the bot is hosted in.
/{BotCommands.PingCommand}: Check how long it takes to Ping the Bot (Only Owner & Sudo).
/{BotCommands.AuthorizeCommand}: Authorize a chat or a user to use the bot (Only Owner & Sudo).
/{BotCommands.UnAuthorizeCommand}: Unauthorize a chat or a user to use the bot (Only Owner & Sudo).
/{BotCommands.UsersCommand}: show users settings (Only Owner & Sudo).
/{BotCommands.AddSudoCommand}: Add sudo user (Only Owner).
/{BotCommands.RmSudoCommand}: Remove sudo users (Only Owner).
/{BotCommands.RestartCommand}: Restart and update the bot (Only Owner & Sudo).
/{BotCommands.LogCommand}: Get a log file of the bot. Handy for getting crash reports (Only Owner & Sudo).
/{BotCommands.ShellCommand}: Run shell commands (Only Owner).
/{BotCommands.AExecCommand}: Exec async functions (Only Owner).
/{BotCommands.ExecCommand}: Exec sync functions (Only Owner).
/{BotCommands.ClearLocalsCommand}: Clear {BotCommands.AExecCommand} or {BotCommands.ExecCommand} locals (Only Owner).
/{BotCommands.RssCommand}: RSS Menu.
"""


@new_task
async def bot_help(_, message):
    await send_message(message, help_string)


async def restart_notification():
    if await aiopath.isfile(".restartmsg"):
        with open(".restartmsg") as f:
            chat_id, msg_id = map(int, f)
    else:
        chat_id, msg_id = 0, 0

    async def send_incomplete_task_message(cid, msg):
        try:
            if msg.startswith("Restarted Successfully!"):
                await bot.edit_message_text(
                    chat_id=chat_id, message_id=msg_id, text=msg
                )
                await remove(".restartmsg")
            else:
                await bot.send_message(
                    chat_id=cid,
                    text=msg,
                    disable_web_page_preview=True,
                    disable_notification=True,
                )
        except Exception as e:
            LOGGER.error(e)

    if config_dict["INCOMPLETE_TASK_NOTIFIER"] and config_dict["DATABASE_URL"]:
        if notifier_dict := await database.get_incomplete_tasks():
            for cid, data in notifier_dict.items():
                msg = "Restarted Successfully!" if cid == chat_id else "Bot Restarted!"
                for tag, links in data.items():
                    msg += f"\n\n{tag}: "
                    for index, link in enumerate(links, start=1):
                        msg += f" <a href='{link}'>{index}</a> |"
                        if len(msg.encode()) > 4000:
                            await send_incomplete_task_message(cid, msg)
                            msg = ""
                if msg:
                    await send_incomplete_task_message(cid, msg)

    if await aiopath.isfile(".restartmsg"):
        try:
            await bot.edit_message_text(
                chat_id=chat_id, message_id=msg_id, text="Restarted Successfully!"
            )
        except:
            pass
        await remove(".restartmsg")
from .modules import initiate_search_tools, get_packages_version, restart_notification


async def main():
    if config_dict["DATABASE_URL"]:
        await database.db_load()
    await load_settings()
    await gather(TgClient.start_bot(), TgClient.start_user())
    await gather(load_configurations(), update_variables())
    await gather(
        sync_to_async(update_qb_options),
        sync_to_async(update_aria2_options),
        update_nzb_options(),
    )
    await gather(
        save_settings(),
        jdownloader.boot(),
        sync_to_async(clean_all),
        bot_settings.initiate_search_tools(),
        initiate_search_tools(),
        get_packages_version(),
        restart_notification(),
        telegraph.create_account(),
        rclone_serve_booter(),
        sync_to_async(start_aria2_listener, wait=False),
    )
    create_help_buttons()

    bot.add_handler(
        MessageHandler(
            start, filters=command(BotCommands.StartCommand, case_sensitive=True)
        )
    )
    bot.add_handler(
        MessageHandler(
            log,
            filters=command(BotCommands.LogCommand, case_sensitive=True)
            & CustomFilters.sudo,
        )
    )
    bot.add_handler(
        MessageHandler(
            restart,
            filters=command(BotCommands.RestartCommand, case_sensitive=True)
            & CustomFilters.sudo,
        )
    )
    bot.add_handler(
        MessageHandler(
            ping,
            filters=command(BotCommands.PingCommand, case_sensitive=True)
            & CustomFilters.authorized,
        )
    )
    bot.add_handler(
        MessageHandler(
            bot_help,
            filters=command(BotCommands.HelpCommand, case_sensitive=True)
            & CustomFilters.authorized,
        )
    )
    bot.add_handler(
        MessageHandler(
            stats,
            filters=command(BotCommands.StatsCommand, case_sensitive=True)
            & CustomFilters.authorized,
        )
    )
    add_handlers()
    LOGGER.info("Bot Started!")
    signal(SIGINT, exit_clean_up)


bot.loop.run_until_complete(main())
bot.loop.run_forever()
bot_loop.run_until_complete(main())
bot_loop.run_forever()
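`restart_notification` above flushes the links message whenever its UTF-8 size passes 4000 bytes, since Telegram caps message length. The byte-budget check can be exercised in isolation (`chunk_links` is a hypothetical helper written for illustration, not part of the bot):

```python
def chunk_links(links, limit=4000):
    """Split numbered HTML links into messages whose UTF-8 size stays near
    `limit`, mirroring the `len(msg.encode()) > 4000` flush in
    restart_notification (a chunk may overshoot by at most one link)."""
    chunks, msg = [], ""
    for index, link in enumerate(links, start=1):
        msg += f" <a href='{link}'>{index}</a> |"
        if len(msg.encode()) > limit:
            chunks.append(msg)
            msg = ""
    if msg:
        chunks.append(msg)  # flush the remainder, like the trailing `if msg:`
    return chunks


parts = chunk_links([f"https://t.me/c/123/{i}" for i in range(300)], limit=500)
```

Note the check uses `len(msg.encode())` rather than `len(msg)`: Telegram's limit is effectively about encoded size, and multi-byte characters in task tags would otherwise slip past a character count.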
|
1	bot/core/__init__.py	Normal file
@@ -0,0 +1 @@

127	bot/core/config_manager.py	Normal file
@@ -0,0 +1,127 @@
from importlib import import_module


class Config:
    AS_DOCUMENT = False
    AUTHORIZED_CHATS = ""
    BASE_URL = ""
    BASE_URL_PORT = 80
    BOT_TOKEN = ""
    CMD_SUFFIX = ""
    DATABASE_URL = ""
    DEFAULT_UPLOAD = "rc"
    DOWNLOAD_DIR = "/usr/src/app/downloads/"
    EQUAL_SPLITS = False
    EXTENSION_FILTER = ""
    FFMPEG_CMDS = []
    FILELION_API = ""
    GDRIVE_ID = ""
    INCOMPLETE_TASK_NOTIFIER = False
    INDEX_URL = ""
    IS_TEAM_DRIVE = False
    JD_EMAIL = ""
    JD_PASS = ""
    LEECH_DUMP_CHAT = ""
    LEECH_FILENAME_PREFIX = ""
    LEECH_SPLIT_SIZE = 2097152000
    MEDIA_GROUP = False
    MIXED_LEECH = False
    NAME_SUBSTITUTE = ""
    OWNER_ID = 0
    QUEUE_ALL = 0
    QUEUE_DOWNLOAD = 0
    QUEUE_UPLOAD = 0
    RCLONE_FLAGS = ""
    RCLONE_PATH = ""
    RCLONE_SERVE_URL = ""
    RCLONE_SERVE_USER = ""
    RCLONE_SERVE_PASS = ""
    RCLONE_SERVE_PORT = 8080
    RSS_CHAT = ""
    RSS_DELAY = 600
    SEARCH_API_LINK = ""
    SEARCH_LIMIT = 0
    SEARCH_PLUGINS = []
    STATUS_LIMIT = 10
    STATUS_UPDATE_INTERVAL = 15
    STOP_DUPLICATE = False
    STREAMWISH_API = ""
    SUDO_USERS = ""
    TELEGRAM_API = 0
    TELEGRAM_HASH = ""
    THUMBNAIL_LAYOUT = ""
    TORRENT_TIMEOUT = 0
    USER_TRANSMISSION = False
    UPSTREAM_REPO = ""
    UPSTREAM_BRANCH = "master"
    USENET_SERVERS = []
    USER_SESSION_STRING = ""
    USE_SERVICE_ACCOUNTS = False
    WEB_PINCODE = False
    YT_DLP_OPTIONS = ""

    @classmethod
    def get(cls, key):
        if hasattr(cls, key):
            return getattr(cls, key)
        raise KeyError(f"{key} is not a valid configuration key.")

    @classmethod
    def set(cls, key, value):
        if hasattr(cls, key):
            setattr(cls, key, value)
        else:
            raise KeyError(f"{key} is not a valid configuration key.")

    @classmethod
    def get_all(cls):
        return {
            key: getattr(cls, key)
            for key in cls.__dict__.keys()
            if not key.startswith("__") and not callable(getattr(cls, key))
        }

    @classmethod
    def load(cls):
        settings = import_module("config")
        for attr in dir(settings):
            if hasattr(cls, attr):
                value = getattr(settings, attr)
                if isinstance(value, str):
                    value = value.strip()
                if not value:
                    continue
                if attr == "DEFAULT_UPLOAD" and value != "rc":
                    value = "gd"
                elif attr == "DOWNLOAD_DIR" and not value.endswith("/"):
                    value = f"{value}/"
                elif attr == "USENET_SERVERS":
                    if not value[0].get("host"):
                        continue
                setattr(cls, attr, value)
        for key in ["BOT_TOKEN", "OWNER_ID", "TELEGRAM_API", "TELEGRAM_HASH"]:
            value = getattr(cls, key)
            if isinstance(value, str):
                value = value.strip()
            if not value:
                raise ValueError(f"{key} variable is missing!")

    @classmethod
    def load_dict(cls, config_dict):
        for key, value in config_dict.items():
            if hasattr(cls, key):
                if key == "DEFAULT_UPLOAD" and value != "rc":
                    value = "gd"
                elif key == "DOWNLOAD_DIR":
                    if not value.endswith("/"):
                        value = f"{value}/"
                elif key == "USENET_SERVERS":
                    if not value[0].get("host"):
                        continue
                setattr(cls, key, value)
        for key in ["BOT_TOKEN", "OWNER_ID", "TELEGRAM_API", "TELEGRAM_HASH"]:
            value = getattr(cls, key)
            if isinstance(value, str):
                value = value.strip()
            if not value:
                raise ValueError(f"{key} variable is missing!")
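The new `Config` class replaces the old `config.env` parsing with dict-like access over class attributes; unknown keys raise `KeyError` instead of silently creating new attributes. A trimmed illustration of that contract (only two representative keys are kept here):

```python
class Config:
    # Two representative keys from the full class above.
    DEFAULT_UPLOAD = "rc"
    STATUS_LIMIT = 10

    @classmethod
    def get(cls, key):
        if hasattr(cls, key):
            return getattr(cls, key)
        raise KeyError(f"{key} is not a valid configuration key.")

    @classmethod
    def set(cls, key, value):
        if hasattr(cls, key):
            setattr(cls, key, value)
        else:
            raise KeyError(f"{key} is not a valid configuration key.")


Config.set("STATUS_LIMIT", 4)   # updates an existing key in place
value = Config.get("STATUS_LIMIT")
try:
    Config.set("NOT_A_KEY", 1)  # unknown keys are rejected, not created
    rejected = False
except KeyError:
    rejected = True
```

Keeping every valid key as a class attribute with a default means `config.py` only needs to list overrides, and a typo in a key name fails loudly at load time.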
309	bot/core/handlers.py	Normal file
@@ -0,0 +1,309 @@
from pyrogram.filters import command, regex
from pyrogram.handlers import MessageHandler, CallbackQueryHandler, EditedMessageHandler

from ..modules import *
from ..helper.telegram_helper.bot_commands import BotCommands
from ..helper.telegram_helper.filters import CustomFilters
from .mltb_client import TgClient


def add_handlers():
    TgClient.bot.add_handler(
        MessageHandler(
            authorize,
            filters=command(BotCommands.AuthorizeCommand, case_sensitive=True)
            & CustomFilters.sudo,
        )
    )
    TgClient.bot.add_handler(
        MessageHandler(
            unauthorize,
            filters=command(BotCommands.UnAuthorizeCommand, case_sensitive=True)
            & CustomFilters.sudo,
        )
    )
    TgClient.bot.add_handler(
        MessageHandler(
            add_sudo,
            filters=command(BotCommands.AddSudoCommand, case_sensitive=True)
            & CustomFilters.sudo,
        )
    )
    TgClient.bot.add_handler(
        MessageHandler(
            remove_sudo,
            filters=command(BotCommands.RmSudoCommand, case_sensitive=True)
            & CustomFilters.sudo,
        )
    )
    TgClient.bot.add_handler(
        MessageHandler(
            send_bot_settings,
            filters=command(BotCommands.BotSetCommand, case_sensitive=True)
            & CustomFilters.sudo,
        )
    )
    TgClient.bot.add_handler(
        CallbackQueryHandler(
            edit_bot_settings, filters=regex("^botset") & CustomFilters.sudo
        )
    )
    TgClient.bot.add_handler(
        MessageHandler(
            cancel,
            filters=command(BotCommands.CancelTaskCommand, case_sensitive=True)
            & CustomFilters.authorized,
        )
    )
    TgClient.bot.add_handler(
        MessageHandler(
            cancel_all_buttons,
            filters=command(BotCommands.CancelAllCommand, case_sensitive=True)
            & CustomFilters.authorized,
        )
    )
    TgClient.bot.add_handler(
        CallbackQueryHandler(cancel_all_update, filters=regex("^canall"))
    )
    TgClient.bot.add_handler(
        CallbackQueryHandler(cancel_multi, filters=regex("^stopm"))
    )
    TgClient.bot.add_handler(
        MessageHandler(
            clone_node,
            filters=command(BotCommands.CloneCommand, case_sensitive=True)
            & CustomFilters.authorized,
        )
    )
    TgClient.bot.add_handler(
        MessageHandler(
            aioexecute,
            filters=command(BotCommands.AExecCommand, case_sensitive=True)
            & CustomFilters.owner,
        )
    )
    TgClient.bot.add_handler(
        MessageHandler(
            execute,
            filters=command(BotCommands.ExecCommand, case_sensitive=True)
            & CustomFilters.owner,
        )
    )
    TgClient.bot.add_handler(
        MessageHandler(
            clear,
            filters=command(BotCommands.ClearLocalsCommand, case_sensitive=True)
            & CustomFilters.owner,
        )
    )
    TgClient.bot.add_handler(
        MessageHandler(
            select,
            filters=command(BotCommands.SelectCommand, case_sensitive=True)
            & CustomFilters.authorized,
        )
    )
    TgClient.bot.add_handler(
        CallbackQueryHandler(confirm_selection, filters=regex("^sel"))
    )
    TgClient.bot.add_handler(
        MessageHandler(
            remove_from_queue,
            filters=command(BotCommands.ForceStartCommand, case_sensitive=True)
            & CustomFilters.authorized,
        )
    )
    TgClient.bot.add_handler(
        MessageHandler(
            count_node,
            filters=command(BotCommands.CountCommand, case_sensitive=True)
            & CustomFilters.authorized,
        )
    )
    TgClient.bot.add_handler(
        MessageHandler(
            delete_file,
            filters=command(BotCommands.DeleteCommand, case_sensitive=True)
            & CustomFilters.authorized,
        )
    )
    TgClient.bot.add_handler(
        MessageHandler(
            gdrive_search,
            filters=command(BotCommands.ListCommand, case_sensitive=True)
            & CustomFilters.authorized,
        )
    )
    TgClient.bot.add_handler(
        CallbackQueryHandler(select_type, filters=regex("^list_types"))
    )
    TgClient.bot.add_handler(CallbackQueryHandler(arg_usage, filters=regex("^help")))
    TgClient.bot.add_handler(
        MessageHandler(
            mirror,
            filters=command(BotCommands.MirrorCommand, case_sensitive=True)
            & CustomFilters.authorized,
        )
    )
    TgClient.bot.add_handler(
        MessageHandler(
            qb_mirror,
            filters=command(BotCommands.QbMirrorCommand, case_sensitive=True)
            & CustomFilters.authorized,
        )
    )
    TgClient.bot.add_handler(
        MessageHandler(
            jd_mirror,
            filters=command(BotCommands.JdMirrorCommand, case_sensitive=True)
            & CustomFilters.authorized,
        )
    )
    TgClient.bot.add_handler(
        MessageHandler(
            nzb_mirror,
            filters=command(BotCommands.NzbMirrorCommand, case_sensitive=True)
            & CustomFilters.authorized,
        )
    )
    TgClient.bot.add_handler(
        MessageHandler(
            leech,
            filters=command(BotCommands.LeechCommand, case_sensitive=True)
            & CustomFilters.authorized,
        )
    )
    TgClient.bot.add_handler(
        MessageHandler(
            qb_leech,
            filters=command(BotCommands.QbLeechCommand, case_sensitive=True)
            & CustomFilters.authorized,
        )
    )
    TgClient.bot.add_handler(
        MessageHandler(
            jd_leech,
            filters=command(BotCommands.JdLeechCommand, case_sensitive=True)
            & CustomFilters.authorized,
        )
    )
    TgClient.bot.add_handler(
        MessageHandler(
            nzb_leech,
            filters=command(BotCommands.NzbLeechCommand, case_sensitive=True)
            & CustomFilters.authorized,
        )
    )
    TgClient.bot.add_handler(
        MessageHandler(
            get_rss_menu,
            filters=command(BotCommands.RssCommand, case_sensitive=True)
            & CustomFilters.authorized,
        )
    )
    TgClient.bot.add_handler(CallbackQueryHandler(rss_listener, filters=regex("^rss")))
    TgClient.bot.add_handler(
        MessageHandler(
            run_shell,
            filters=command(BotCommands.ShellCommand, case_sensitive=True)
            & CustomFilters.owner,
        )
    )
    TgClient.bot.add_handler(
        EditedMessageHandler(
            run_shell,
            filters=command(BotCommands.ShellCommand, case_sensitive=True)
            & CustomFilters.owner,
        )
    )
    TgClient.bot.add_handler(
        MessageHandler(
            start, filters=command(BotCommands.StartCommand, case_sensitive=True)
        )
    )
    TgClient.bot.add_handler(
        MessageHandler(
            log,
            filters=command(BotCommands.LogCommand, case_sensitive=True)
            & CustomFilters.sudo,
        )
    )
    TgClient.bot.add_handler(
        MessageHandler(
            restart_bot,
            filters=command(BotCommands.RestartCommand, case_sensitive=True)
            & CustomFilters.sudo,
        )
    )
    TgClient.bot.add_handler(
        MessageHandler(
            ping,
            filters=command(BotCommands.PingCommand, case_sensitive=True)
            & CustomFilters.authorized,
        )
    )
    TgClient.bot.add_handler(
        MessageHandler(
            bot_help,
            filters=command(BotCommands.HelpCommand, case_sensitive=True)
            & CustomFilters.authorized,
        )
    )
    TgClient.bot.add_handler(
        MessageHandler(
            bot_stats,
            filters=command(BotCommands.StatsCommand, case_sensitive=True)
            & CustomFilters.authorized,
        )
    )
    TgClient.bot.add_handler(
        MessageHandler(
            task_status,
            filters=command(BotCommands.StatusCommand, case_sensitive=True)
            & CustomFilters.authorized,
        )
    )
    TgClient.bot.add_handler(
        CallbackQueryHandler(status_pages, filters=regex("^status"))
    )
    TgClient.bot.add_handler(
        MessageHandler(
            torrent_search,
            filters=command(BotCommands.SearchCommand, case_sensitive=True)
            & CustomFilters.authorized,
        )
    )
    TgClient.bot.add_handler(
        CallbackQueryHandler(torrent_search_update, filters=regex("^torser"))
    )
    TgClient.bot.add_handler(
        MessageHandler(
            get_users_settings,
            filters=command(BotCommands.UsersCommand, case_sensitive=True)
            & CustomFilters.sudo,
        )
    )
    TgClient.bot.add_handler(
        MessageHandler(
            send_user_settings,
            filters=command(BotCommands.UserSetCommand, case_sensitive=True)
            & CustomFilters.authorized,
        )
    )
    TgClient.bot.add_handler(
        CallbackQueryHandler(edit_user_settings, filters=regex("^userset"))
    )
    TgClient.bot.add_handler(
        MessageHandler(
            ytdl,
            filters=command(BotCommands.YtdlCommand, case_sensitive=True)
            & CustomFilters.authorized,
        )
    )
    TgClient.bot.add_handler(
        MessageHandler(
            ytdl_leech,
            filters=command(BotCommands.YtdlLeechCommand, case_sensitive=True)
            & CustomFilters.authorized,
        )
    )
71	bot/core/mltb_client.py	Normal file
@@ -0,0 +1,71 @@
from pyrogram import Client, enums
from asyncio import Lock

from .. import LOGGER
from .config_manager import Config


class TgClient:
    _lock = Lock()
    bot = None
    user = None
    NAME = ""
    ID = 0
    IS_PREMIUM_USER = False
    MAX_SPLIT_SIZE = 2097152000

    @classmethod
    async def start_bot(cls):
        LOGGER.info("Creating client from BOT_TOKEN")
        cls.bot = Client(
            "bot",
            Config.TELEGRAM_API,
            Config.TELEGRAM_HASH,
            bot_token=Config.BOT_TOKEN,
            parse_mode=enums.ParseMode.HTML,
            sleep_threshold=60,
            max_concurrent_transmissions=10,
        )
        await cls.bot.start()
        cls.NAME = cls.bot.me.username
        cls.ID = Config.BOT_TOKEN.split(":", 1)[0]

    @classmethod
    async def start_user(cls):
        if Config.USER_SESSION_STRING:
            LOGGER.info("Creating client from USER_SESSION_STRING")
            try:
                cls.user = Client(
                    "user",
                    Config.TELEGRAM_API,
                    Config.TELEGRAM_HASH,
                    session_string=Config.USER_SESSION_STRING,
                    parse_mode=enums.ParseMode.HTML,
                    sleep_threshold=60,
                    max_concurrent_transmissions=10,
                )
                await cls.user.start()
                cls.IS_PREMIUM_USER = cls.user.me.is_premium
                if cls.IS_PREMIUM_USER:
                    cls.MAX_SPLIT_SIZE = 4194304000
            except Exception as e:
                LOGGER.error(f"Failed to start client from USER_SESSION_STRING. {e}")
                cls.IS_PREMIUM_USER = False
                cls.user = None

    @classmethod
    async def stop(cls):
        async with cls._lock:
            if cls.bot:
                await cls.bot.stop()
            if cls.user:
                await cls.user.stop()
            LOGGER.info("Client stopped")

    @classmethod
    async def reload(cls):
        async with cls._lock:
            await cls.bot.restart()
            if cls.user:
                await cls.user.restart()
            LOGGER.info("Client restarted")
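`start_user` above bumps `MAX_SPLIT_SIZE` from roughly 2 GB to roughly 4 GB when the user session belongs to a Telegram Premium account. The same decision as a standalone helper (`max_split_size` is written for illustration; the bot keeps this state on `TgClient` instead):

```python
def max_split_size(is_premium_user: bool) -> int:
    """Telegram upload cap: ~2 GB for normal accounts, ~4 GB for premium,
    matching the 2097152000 / 4194304000 constants on TgClient."""
    return 4194304000 if is_premium_user else 2097152000


normal_cap = max_split_size(False)
premium_cap = max_split_size(True)
```

Leech split sizes configured above this cap would make uploads fail, which is why the class falls back to the 2 GB constant whenever the user client cannot be started.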
253	bot/core/startup.py	Normal file
@@ -0,0 +1,253 @@
|
||||
from aiofiles.os import path as aiopath, remove, makedirs
from aiofiles import open as aiopen
from aioshutil import rmtree
from asyncio import create_subprocess_exec, create_subprocess_shell

from .. import (
    aria2_options,
    qbit_options,
    nzb_options,
    drives_ids,
    drives_names,
    index_urls,
    user_data,
    extension_filter,
    LOGGER,
    rss_dict,
    qbittorrent_client,
    sabnzbd_client,
    aria2,
)
from ..helper.ext_utils.db_handler import database
from .config_manager import Config
from .mltb_client import TgClient


def update_qb_options():
    if not qbit_options:
        qbit_options.update(dict(qbittorrent_client.app_preferences()))
        del qbit_options["listen_port"]
        for k in list(qbit_options.keys()):
            if k.startswith("rss"):
                del qbit_options[k]
        qbit_options["web_ui_password"] = "mltbmltb"
        qbittorrent_client.app_set_preferences({"web_ui_password": "mltbmltb"})
    else:
        qbittorrent_client.app_set_preferences(qbit_options)


def update_aria2_options():
    if not aria2_options:
        aria2_options.update(aria2.client.get_global_option())
    else:
        aria2.set_global_options(aria2_options)


async def update_nzb_options():
    no = (await sabnzbd_client.get_config())["config"]["misc"]
    nzb_options.update(no)


async def load_settings():
    if not Config.DATABASE_URL:
        return
    await database.connect()
    if database.db is not None:
        BOT_ID = Config.BOT_TOKEN.split(":", 1)[0]
        config_file = Config.get_all()
        old_config = await database.db.settings.deployConfig.find_one({"_id": BOT_ID})
        if old_config is None:
            await database.db.settings.deployConfig.replace_one(
                {"_id": BOT_ID}, config_file, upsert=True
            )
        else:
            del old_config["_id"]
        if old_config and old_config != config_file:
            await database.db.settings.deployConfig.replace_one(
                {"_id": BOT_ID}, config_file, upsert=True
            )
        else:
            config_dict = await database.db.settings.config.find_one(
                {"_id": BOT_ID}, {"_id": 0}
            )
            if config_dict:
                Config.load_dict(config_dict)

        if pf_dict := await database.db.settings.files.find_one(
            {"_id": BOT_ID}, {"_id": 0}
        ):
            for key, value in pf_dict.items():
                if value:
                    file_ = key.replace("__", ".")
                    async with aiopen(file_, "wb+") as f:
                        await f.write(value)

        if a2c_options := await database.db.settings.aria2c.find_one(
            {"_id": BOT_ID}, {"_id": 0}
        ):
            aria2_options.update(a2c_options)

        if qbit_opt := await database.db.settings.qbittorrent.find_one(
            {"_id": BOT_ID}, {"_id": 0}
        ):
            qbit_options.update(qbit_opt)

        if nzb_opt := await database.db.settings.nzb.find_one(
            {"_id": BOT_ID}, {"_id": 0}
        ):
            if await aiopath.exists("sabnzbd/SABnzbd.ini.bak"):
                await remove("sabnzbd/SABnzbd.ini.bak")
            ((key, value),) = nzb_opt.items()
            file_ = key.replace("__", ".")
            async with aiopen(f"sabnzbd/{file_}", "wb+") as f:
                await f.write(value)

        if await database.db.users.find_one():
            rows = database.db.users.find({})
            async for row in rows:
                uid = row["_id"]
                del row["_id"]
                thumb_path = f"Thumbnails/{uid}.jpg"
                rclone_config_path = f"rclone/{uid}.conf"
                token_path = f"tokens/{uid}.pickle"
                if row.get("thumb"):
                    if not await aiopath.exists("Thumbnails"):
                        await makedirs("Thumbnails")
                    async with aiopen(thumb_path, "wb+") as f:
                        await f.write(row["thumb"])
                    row["thumb"] = thumb_path
                if row.get("rclone_config"):
                    if not await aiopath.exists("rclone"):
                        await makedirs("rclone")
                    async with aiopen(rclone_config_path, "wb+") as f:
                        await f.write(row["rclone_config"])
                    row["rclone_config"] = rclone_config_path
                if row.get("token_pickle"):
                    if not await aiopath.exists("tokens"):
                        await makedirs("tokens")
                    async with aiopen(token_path, "wb+") as f:
                        await f.write(row["token_pickle"])
                    row["token_pickle"] = token_path
                user_data[uid] = row
            LOGGER.info("Users data has been imported from Database")

        if await database.db.rss[BOT_ID].find_one():
            rows = database.db.rss[BOT_ID].find({})
            async for row in rows:
                user_id = row["_id"]
                del row["_id"]
                rss_dict[user_id] = row
            LOGGER.info("Rss data has been imported from Database.")


async def save_settings():
    if database.db is None:
        return
    config_dict = Config.get_all()
    await database.db.settings.config.replace_one(
        {"_id": TgClient.ID}, config_dict, upsert=True
    )
    if await database.db.settings.aria2c.find_one({"_id": TgClient.ID}) is None:
        await database.db.settings.aria2c.update_one(
            {"_id": TgClient.ID}, {"$set": aria2_options}, upsert=True
        )
    if await database.db.settings.qbittorrent.find_one({"_id": TgClient.ID}) is None:
        await database.save_qbit_settings()
    if await database.db.settings.nzb.find_one({"_id": TgClient.ID}) is None:
        async with aiopen("sabnzbd/SABnzbd.ini", "rb+") as pf:
            nzb_conf = await pf.read()
        await database.db.settings.nzb.update_one(
            {"_id": TgClient.ID}, {"$set": {"SABnzbd__ini": nzb_conf}}, upsert=True
        )


async def update_variables():
    if (
        Config.LEECH_SPLIT_SIZE > TgClient.MAX_SPLIT_SIZE
        or Config.LEECH_SPLIT_SIZE == 2097152000
        or not Config.LEECH_SPLIT_SIZE
    ):
        Config.LEECH_SPLIT_SIZE = TgClient.MAX_SPLIT_SIZE

    Config.MIXED_LEECH = bool(Config.MIXED_LEECH and TgClient.IS_PREMIUM_USER)
    Config.USER_TRANSMISSION = bool(
        Config.USER_TRANSMISSION and TgClient.IS_PREMIUM_USER
    )

    if Config.AUTHORIZED_CHATS:
        aid = Config.AUTHORIZED_CHATS.split()
        for id_ in aid:
            chat_id, *thread_ids = id_.split("|")
            chat_id = int(chat_id.strip())
            if thread_ids:
                thread_ids = list(map(lambda x: int(x.strip()), thread_ids))
                user_data[chat_id] = {"is_auth": True, "thread_ids": thread_ids}
            else:
                user_data[chat_id] = {"is_auth": True}

    if Config.SUDO_USERS:
        aid = Config.SUDO_USERS.split()
        for id_ in aid:
            user_data[int(id_.strip())] = {"is_sudo": True}

    if Config.EXTENSION_FILTER:
        fx = Config.EXTENSION_FILTER.split()
        for x in fx:
            x = x.lstrip(".")
            extension_filter.append(x.strip().lower())

    if Config.GDRIVE_ID:
        drives_names.append("Main")
        drives_ids.append(Config.GDRIVE_ID)
        index_urls.append(Config.INDEX_URL)

    if await aiopath.exists("list_drives.txt"):
        async with aiopen("list_drives.txt", "r+") as f:
            lines = await f.readlines()
            for line in lines:
                temp = line.strip().split()
                drives_ids.append(temp[1])
                drives_names.append(temp[0].replace("_", " "))
                if len(temp) > 2:
                    index_urls.append(temp[2])
                else:
                    index_urls.append("")

    if not await aiopath.exists("accounts"):
        Config.USE_SERVICE_ACCOUNTS = False


async def load_configurations():
    if not await aiopath.exists(".netrc"):
        async with aiopen(".netrc", "w"):
            pass

    await (
        await create_subprocess_shell(
            "chmod 600 .netrc && cp .netrc /root/.netrc && chmod +x aria-nox-nzb.sh && ./aria-nox-nzb.sh"
        )
    ).wait()

    if Config.BASE_URL:
        await create_subprocess_shell(
            f"gunicorn web.wserver:app --bind 0.0.0.0:{Config.BASE_URL_PORT} --worker-class gevent"
        )

    if await aiopath.exists("cfg.zip"):
        if await aiopath.exists("/JDownloader/cfg"):
            await rmtree("/JDownloader/cfg", ignore_errors=True)
        await (
            await create_subprocess_exec("7z", "x", "cfg.zip", "-o/JDownloader")
        ).wait()
        await remove("cfg.zip")

    if await aiopath.exists("accounts.zip"):
        if await aiopath.exists("accounts"):
            await rmtree("accounts")
        await (
            await create_subprocess_exec(
                "7z", "x", "-o.", "-aoa", "accounts.zip", "accounts/*.json"
            )
        ).wait()
        await (await create_subprocess_exec("chmod", "-R", "777", "accounts")).wait()
        await remove("accounts.zip")
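`update_variables` above parses `AUTHORIZED_CHATS` entries of the form `chat_id|thread_id|thread_id...` into the `user_data` mapping. A standalone sketch of just that parsing (the sample chat ids are made up):

```python
def parse_authorized_chats(value: str) -> dict:
    """Parse 'chat_id|thread_id|...' tokens into a user_data-style dict."""
    user_data = {}
    for id_ in value.split():
        chat_id, *thread_ids = id_.split("|")
        chat_id = int(chat_id.strip())
        if thread_ids:
            # topic (thread) ids attached to a supergroup chat id
            thread_ids = [int(x.strip()) for x in thread_ids]
            user_data[chat_id] = {"is_auth": True, "thread_ids": thread_ids}
        else:
            user_data[chat_id] = {"is_auth": True}
    return user_data


parsed = parse_authorized_chats("-100123 -100456|7|8")
# -100123 is authorized with no thread restriction; -100456 only in topics 7 and 8
```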
@@ -7,22 +7,19 @@ from aioshutil import move, copy2
 from pyrogram.enums import ChatAction
 from re import sub, I

-from bot import (
-    DOWNLOAD_DIR,
-    MAX_SPLIT_SIZE,
-    config_dict,
+from .. import (
     user_data,
-    IS_PREMIUM_USER,
-    user,
     multi_tags,
     LOGGER,
     task_dict_lock,
     task_dict,
-    global_extension_filter,
+    extension_filter,
     cpu_eater_lock,
     subprocess_lock,
     intervals,
 )
+from ..core.config_manager import Config
+from ..core.mltb_client import TgClient
 from .ext_utils.bot_utils import new_task, sync_to_async, get_size_bytes
 from .ext_utils.bulk_links import extract_bulk_links
 from .ext_utils.exceptions import NotSupportedExtractionArchive
@@ -70,7 +67,7 @@ class TaskConfig:
         self.user = self.message.from_user or self.message.sender_chat
         self.user_id = self.user.id
         self.user_dict = user_data.get(self.user_id, {})
-        self.dir = f"{DOWNLOAD_DIR}{self.mid}"
+        self.dir = f"{Config.DOWNLOAD_DIR}{self.mid}"
         self.link = ""
         self.up_dest = ""
         self.rc_flags = ""
@@ -125,7 +122,7 @@ class TaskConfig:
             return f"tokens/{self.user_id}.pickle"
         elif (
             dest.startswith("sa:")
-            or config_dict["USE_SERVICE_ACCOUNTS"]
+            or Config.USE_SERVICE_ACCOUNTS
             and not dest.startswith("tp:")
         ):
             return "accounts"
@@ -160,17 +157,13 @@ class TaskConfig:
         self.name_sub = (
             self.name_sub
             or self.user_dict.get("name_sub", False)
-            or (
-                config_dict["NAME_SUBSTITUTE"]
-                if "name_sub" not in self.user_dict
-                else ""
-            )
+            or (Config.NAME_SUBSTITUTE if "name_sub" not in self.user_dict else "")
         )
         if self.name_sub:
             self.name_sub = [x.split("/") for x in self.name_sub.split(" | ")]
             self.seed = False
         self.extension_filter = self.user_dict.get("excluded_extensions") or (
-            global_extension_filter
+            extension_filter
             if "excluded_extensions" not in self.user_dict
             else ["aria2", "!qB"]
         )
@@ -199,9 +192,9 @@ class TaskConfig:
             if not is_gdrive_id(self.link):
                 raise ValueError(self.link)

-        self.user_transmission = IS_PREMIUM_USER and (
+        self.user_transmission = TgClient.IS_PREMIUM_USER and (
             self.user_dict.get("user_transmission")
-            or config_dict["USER_TRANSMISSION"]
+            or Config.USER_TRANSMISSION
             and "user_transmission" not in self.user_dict
         )
@@ -215,11 +208,7 @@ class TaskConfig:
         self.ffmpeg_cmds = (
             self.ffmpeg_cmds
             or self.user_dict.get("ffmpeg_cmds", None)
-            or (
-                config_dict["FFMPEG_CMDS"]
-                if "ffmpeg_cmds" not in self.user_dict
-                else None
-            )
+            or (Config.FFMPEG_CMDS if "ffmpeg_cmds" not in self.user_dict else None)
         )
         if self.ffmpeg_cmds:
             self.seed = False
@@ -228,20 +217,15 @@ class TaskConfig:
             self.stop_duplicate = (
                 self.user_dict.get("stop_duplicate")
                 or "stop_duplicate" not in self.user_dict
-                and config_dict["STOP_DUPLICATE"]
+                and Config.STOP_DUPLICATE
             )
             default_upload = (
-                self.user_dict.get("default_upload", "")
-                or config_dict["DEFAULT_UPLOAD"]
+                self.user_dict.get("default_upload", "") or Config.DEFAULT_UPLOAD
             )
             if (not self.up_dest and default_upload == "rc") or self.up_dest == "rc":
-                self.up_dest = (
-                    self.user_dict.get("rclone_path") or config_dict["RCLONE_PATH"]
-                )
+                self.up_dest = self.user_dict.get("rclone_path") or Config.RCLONE_PATH
             elif (not self.up_dest and default_upload == "gd") or self.up_dest == "gd":
-                self.up_dest = (
-                    self.user_dict.get("gdrive_id") or config_dict["GDRIVE_ID"]
-                )
+                self.up_dest = self.user_dict.get("gdrive_id") or Config.GDRIVE_ID
             if not self.up_dest:
                 raise ValueError("No Upload Destination!")
             if is_gdrive_id(self.up_dest):
@@ -301,11 +285,11 @@ class TaskConfig:
                 self.up_dest = (
                     self.up_dest
                     or self.user_dict.get("leech_dest")
-                    or config_dict["LEECH_DUMP_CHAT"]
+                    or Config.LEECH_DUMP_CHAT
                 )
-                self.mixed_leech = IS_PREMIUM_USER and (
+                self.mixed_leech = TgClient.IS_PREMIUM_USER and (
                     self.user_dict.get("mixed_leech")
-                    or config_dict["MIXED_LEECH"]
+                    or Config.MIXED_LEECH
                     and "mixed_leech" not in self.user_dict
                 )
                 if self.up_dest:
@@ -316,9 +300,9 @@ class TaskConfig:
                         self.mixed_leech = False
                     elif self.up_dest.startswith("u:"):
                         self.up_dest = self.up_dest.replace("u:", "", 1)
-                        self.user_transmission = IS_PREMIUM_USER
+                        self.user_transmission = TgClient.IS_PREMIUM_USER
                     elif self.up_dest.startswith("m:"):
-                        self.user_transmission = IS_PREMIUM_USER
+                        self.user_transmission = TgClient.IS_PREMIUM_USER
                         self.mixed_leech = self.user_transmission
                     if "|" in self.up_dest:
                         self.up_dest, self.chat_thread_id = list(
@@ -333,32 +317,58 @@ class TaskConfig:
                     self.up_dest = self.user_id

                 if self.user_transmission:
-                    chat = await user.get_chat(self.up_dest)
-                    uploader_id = user.me.id
-                else:
-                    chat = await self.client.get_chat(self.up_dest)
-                    uploader_id = self.client.me.id
-
-                if chat.type.name in ["SUPERGROUP", "CHANNEL"]:
-                    member = await chat.get_member(uploader_id)
-                    if (
-                        not member.privileges.can_manage_chat
-                        or not member.privileges.can_delete_messages
-                    ):
-                        raise ValueError(
-                            "You don't have enough privileges in this chat!"
-                        )
-                elif self.user_transmission:
-                    raise ValueError(
-                        "Custom Leech Destination only allowed for super-group or channel when UserTransmission enalbed!\nDisable UserTransmission so bot can send files to user!"
-                    )
-                else:
-                    try:
-                        await self.client.send_chat_action(
-                            self.up_dest, ChatAction.TYPING
-                        )
-                    except:
-                        raise ValueError("Start the bot and try again!")
+                    try:
+                        chat = await TgClient.user.get_chat(self.up_dest)
+                    except:
+                        chat = None
+                    if chat is None:
+                        self.user_transmission = False
+                        self.mixed_leech = False
+                    else:
+                        uploader_id = TgClient.user.me.id
+                        if chat.type.name not in ["SUPERGROUP", "CHANNEL", "GROUP"]:
+                            self.user_transmission = False
+                            self.mixed_leech = False
+                        else:
+                            member = await chat.get_member(uploader_id)
+                            if (
+                                not member.privileges.can_manage_chat
+                                or not member.privileges.can_delete_messages
+                            ):
+                                self.user_transmission = False
+                                self.mixed_leech = False
+
+                if not self.user_transmission or self.mixed_leech:
+                    try:
+                        chat = await self.client.get_chat(self.up_dest)
+                    except:
+                        chat = None
+                    if chat is None:
+                        if self.user_transmission:
+                            self.mixed_leech = False
+                        else:
+                            raise ValueError("Chat not found!")
+                    else:
+                        uploader_id = self.client.me.id
+                        if chat.type.name in ["SUPERGROUP", "CHANNEL", "GROUP"]:
+                            member = await chat.get_member(uploader_id)
+                            if (
+                                not member.privileges.can_manage_chat
+                                or not member.privileges.can_delete_messages
+                            ):
+                                if not self.user_transmission:
+                                    raise ValueError(
+                                        "You don't have enough privileges in this chat!"
+                                    )
+                                else:
+                                    self.mixed_leech = False
+                        else:
+                            try:
+                                await self.client.send_chat_action(
+                                    self.up_dest, ChatAction.TYPING
+                                )
+                            except:
+                                raise ValueError("Start the bot and try again!")
                 elif (
                     self.user_transmission or self.mixed_leech
                 ) and not self.is_super_chat:
@@ -372,15 +382,15 @@ class TaskConfig:
             self.split_size = (
                 self.split_size
                 or self.user_dict.get("split_size")
-                or config_dict["LEECH_SPLIT_SIZE"]
+                or Config.LEECH_SPLIT_SIZE
             )
             self.equal_splits = (
                 self.user_dict.get("equal_splits")
-                or config_dict["EQUAL_SPLITS"]
+                or Config.EQUAL_SPLITS
                 and "equal_splits" not in self.user_dict
             )
             self.max_split_size = (
-                MAX_SPLIT_SIZE if self.user_transmission else 2097152000
+                TgClient.MAX_SPLIT_SIZE if self.user_transmission else 2097152000
             )
             self.split_size = min(self.split_size, self.max_split_size)
@@ -390,7 +400,7 @@ class TaskConfig:
                 if self.as_med
                 else (
                     self.user_dict.get("as_doc", False)
-                    or config_dict["AS_DOCUMENT"]
+                    or Config.AS_DOCUMENT
                     and "as_doc" not in self.user_dict
                 )
             )
@@ -399,7 +409,7 @@ class TaskConfig:
                 self.thumbnail_layout
                 or self.user_dict.get("thumb_layout", False)
                 or (
-                    config_dict["THUMBNAIL_LAYOUT"]
+                    Config.THUMBNAIL_LAYOUT
                     if "thumb_layout" not in self.user_dict
                     else ""
                 )
@@ -1124,7 +1134,11 @@ class TaskConfig:

     async def proceed_ffmpeg(self, dl_path, gid):
         checked = False
-        for ffmpeg_cmd in self.ffmpeg_cmds:
+        cmds = [
+            [part.strip() for part in item.split() if part.strip()]
+            for item in self.ffmpeg_cmds
+        ]
+        for ffmpeg_cmd in cmds:
             cmd = [
                 "ffmpeg",
                 "-hide_banner",
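The `proceed_ffmpeg` change above implements the README note that ffmpeg commands are now plain strings: each string is whitespace-split into an argument list before `ffmpeg` is prepended. A sketch of that transformation in isolation:

```python
def split_ffmpeg_cmds(ffmpeg_cmds):
    """Turn each command string into an argv-style list, as proceed_ffmpeg now does."""
    return [
        [part.strip() for part in item.split() if part.strip()]
        for item in ffmpeg_cmds
    ]


cmds = split_ffmpeg_cmds(["-i mltb.mkv -c copy -c:s srt mltb.mkv -del"])
# → [["-i", "mltb.mkv", "-c", "copy", "-c:s", "srt", "mltb.mkv", "-del"]]
```

Note the trade-off of plain `str.split`: an argument that itself contains spaces (e.g. a metadata title) cannot be expressed in this format.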
@@ -9,7 +9,8 @@ from asyncio import (
     sleep,
 )

-from bot import user_data, config_dict, bot_loop
+from ... import user_data, bot_loop
+from ...core.config_manager import Config
 from ..telegram_helper.button_build import ButtonMaker
 from .telegraph_helper import telegraph
 from .help_messages import (
@@ -57,12 +58,13 @@ def bt_selection_buttons(id_):
     gid = id_[:12] if len(id_) > 25 else id_
     pin = "".join([n for n in id_ if n.isdigit()][:4])
     buttons = ButtonMaker()
-    BASE_URL = config_dict["BASE_URL"]
-    if config_dict["WEB_PINCODE"]:
-        buttons.url_button("Select Files", f"{BASE_URL}/app/files?gid={id_}")
+    if Config.WEB_PINCODE:
+        buttons.url_button("Select Files", f"{Config.BASE_URL}/app/files?gid={id_}")
         buttons.data_button("Pincode", f"sel pin {gid} {pin}")
     else:
-        buttons.url_button("Select Files", f"{BASE_URL}/app/files?gid={id_}&pin={pin}")
+        buttons.url_button(
+            "Select Files", f"{Config.BASE_URL}/app/files?gid={id_}&pin={pin}"
+        )
     buttons.data_button("Done Selecting", f"sel done {gid} {id_}")
     buttons.data_button("Cancel", f"sel cancel {gid}")
     return buttons.build_menu(2)
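`bt_selection_buttons` derives a short gid for long (torrent-hash-style) ids and a 4-digit pincode from the digits of the id. A sketch of just that derivation (the sample hash is arbitrary):

```python
def gid_and_pin(id_: str):
    """Short gid for ids longer than 25 chars; pin = first four digits in the id."""
    gid = id_[:12] if len(id_) > 25 else id_
    pin = "".join([n for n in id_ if n.isdigit()][:4])
    return gid, pin


gid, pin = gid_and_pin("3f786850e387550fdab836ed7e6dc881de23001b")
# → gid "3f786850e387", pin "3786"
```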
@@ -1,151 +1,81 @@
 from aiofiles import open as aiopen
-from aiofiles.os import path as aiopath, makedirs
-from dotenv import dotenv_values
+from aiofiles.os import path as aiopath
+from importlib import import_module
 from motor.motor_asyncio import AsyncIOMotorClient
 from pymongo.server_api import ServerApi
 from pymongo.errors import PyMongoError

-from bot import (
-    user_data,
-    rss_dict,
-    LOGGER,
-    BOT_ID,
-    config_dict,
-    aria2_options,
-    qbit_options,
-)
+from ... import LOGGER, user_data, rss_dict, qbit_options
+from ...core.mltb_client import TgClient
+from ...core.config_manager import Config


 class DbManager:
     def __init__(self):
-        self._return = False
-        self._db = None
+        self._return = True
         self._conn = None
+        self.db = None

     async def connect(self):
         try:
-            if config_dict["DATABASE_URL"]:
-                if self._conn is not None:
-                    await self._conn.close()
-                self._conn = AsyncIOMotorClient(
-                    config_dict["DATABASE_URL"], server_api=ServerApi("1")
-                )
-                self._db = self._conn.mltb
-                self._return = False
-            else:
-                self._return = True
+            if self._conn is not None:
+                await self._conn.close()
+            self._conn = AsyncIOMotorClient(
+                Config.DATABASE_URL, server_api=ServerApi("1")
+            )
+            self.db = self._conn.mltb
+            self._return = False
         except PyMongoError as e:
             LOGGER.error(f"Error in DB connection: {e}")
+            self.db = None
+            self._return = True
             self._conn = None

     async def disconnect(self):
+        self._return = True
         if self._conn is not None:
             await self._conn.close()
         self._conn = None
-        self._return = True

-    async def db_load(self):
-        if self._db is None:
-            await self.connect()
-        if self._return:
-            return
-        # Save bot settings
-        try:
-            await self._db.settings.config.replace_one(
-                {"_id": BOT_ID}, config_dict, upsert=True
-            )
-        except Exception as e:
-            LOGGER.error(f"DataBase Collection Error: {e}")
-            return
-        # Save Aria2c options
-        if await self._db.settings.aria2c.find_one({"_id": BOT_ID}) is None:
-            await self._db.settings.aria2c.update_one(
-                {"_id": BOT_ID}, {"$set": aria2_options}, upsert=True
-            )
-        # Save qbittorrent options
-        if await self._db.settings.qbittorrent.find_one({"_id": BOT_ID}) is None:
-            await self.save_qbit_settings()
-        # Save nzb config
-        if await self._db.settings.nzb.find_one({"_id": BOT_ID}) is None:
-            async with aiopen("sabnzbd/SABnzbd.ini", "rb+") as pf:
-                nzb_conf = await pf.read()
-            await self._db.settings.nzb.update_one(
-                {"_id": BOT_ID}, {"$set": {"SABnzbd__ini": nzb_conf}}, upsert=True
-            )
-        # User Data
-        if await self._db.users.find_one():
-            rows = self._db.users.find({})
-            # return a dict ==> {_id, is_sudo, is_auth, as_doc, thumb, yt_opt, media_group, equal_splits, split_size, rclone, rclone_path, token_pickle, gdrive_id, leech_dest, lperfix, lprefix, excluded_extensions, user_transmission, index_url, default_upload}
-            async for row in rows:
-                uid = row["_id"]
-                del row["_id"]
-                thumb_path = f"Thumbnails/{uid}.jpg"
-                rclone_config_path = f"rclone/{uid}.conf"
-                token_path = f"tokens/{uid}.pickle"
-                if row.get("thumb"):
-                    if not await aiopath.exists("Thumbnails"):
-                        await makedirs("Thumbnails")
-                    async with aiopen(thumb_path, "wb+") as f:
-                        await f.write(row["thumb"])
-                    row["thumb"] = thumb_path
-                if row.get("rclone_config"):
-                    if not await aiopath.exists("rclone"):
-                        await makedirs("rclone")
-                    async with aiopen(rclone_config_path, "wb+") as f:
-                        await f.write(row["rclone_config"])
-                    row["rclone_config"] = rclone_config_path
-                if row.get("token_pickle"):
-                    if not await aiopath.exists("tokens"):
-                        await makedirs("tokens")
-                    async with aiopen(token_path, "wb+") as f:
-                        await f.write(row["token_pickle"])
-                    row["token_pickle"] = token_path
-                user_data[uid] = row
-            LOGGER.info("Users data has been imported from Database")
-        # Rss Data
-        if await self._db.rss[BOT_ID].find_one():
-            # return a dict ==> {_id, title: {link, last_feed, last_name, inf, exf, command, paused}
-            rows = self._db.rss[BOT_ID].find({})
-            async for row in rows:
-                user_id = row["_id"]
-                del row["_id"]
-                rss_dict[user_id] = row
-            LOGGER.info("Rss data has been imported from Database.")

     async def update_deploy_config(self):
         if self._return:
             return
-        current_config = dict(dotenv_values("config.env"))
-        await self._db.settings.deployConfig.replace_one(
-            {"_id": BOT_ID}, current_config, upsert=True
+        settings = import_module("config")
+        config_file = {
+            key: value.strip() if isinstance(value, str) else value
+            for key, value in vars(settings).items()
+            if not key.startswith("__")
+        }
+        await self.db.settings.deployConfig.replace_one(
+            {"_id": TgClient.ID}, config_file, upsert=True
         )

     async def update_config(self, dict_):
         if self._return:
             return
-        await self._db.settings.config.update_one(
-            {"_id": BOT_ID}, {"$set": dict_}, upsert=True
+        await self.db.settings.config.update_one(
+            {"_id": TgClient.ID}, {"$set": dict_}, upsert=True
         )

     async def update_aria2(self, key, value):
         if self._return:
             return
-        await self._db.settings.aria2c.update_one(
-            {"_id": BOT_ID}, {"$set": {key: value}}, upsert=True
+        await self.db.settings.aria2c.update_one(
+            {"_id": TgClient.ID}, {"$set": {key: value}}, upsert=True
         )

     async def update_qbittorrent(self, key, value):
         if self._return:
             return
-        await self._db.settings.qbittorrent.update_one(
-            {"_id": BOT_ID}, {"$set": {key: value}}, upsert=True
+        await self.db.settings.qbittorrent.update_one(
+            {"_id": TgClient.ID}, {"$set": {key: value}}, upsert=True
         )

     async def save_qbit_settings(self):
         if self._return:
             return
-        await self._db.settings.qbittorrent.replace_one(
-            {"_id": BOT_ID}, qbit_options, upsert=True
+        await self.db.settings.qbittorrent.replace_one(
+            {"_id": TgClient.ID}, qbit_options, upsert=True
         )

     async def update_private_file(self, path):
@@ -157,17 +87,19 @@ class DbManager:
         else:
             pf_bin = ""
         path = path.replace(".", "__")
-        await self._db.settings.files.update_one(
-            {"_id": BOT_ID}, {"$set": {path: pf_bin}}, upsert=True
+        await self.db.settings.files.update_one(
+            {"_id": TgClient.ID}, {"$set": {path: pf_bin}}, upsert=True
         )
-        if path == "config.env":
+        if path == "config.py":
             await self.update_deploy_config()

     async def update_nzb_config(self):
         if self._return:
             return
         async with aiopen("sabnzbd/SABnzbd.ini", "rb+") as pf:
             nzb_conf = await pf.read()
-        await self._db.settings.nzb.replace_one(
-            {"_id": BOT_ID}, {"SABnzbd__ini": nzb_conf}, upsert=True
+        await self.db.settings.nzb.replace_one(
+            {"_id": TgClient.ID}, {"SABnzbd__ini": nzb_conf}, upsert=True
         )

     async def update_user_data(self, user_id):
@@ -180,7 +112,7 @@ class DbManager:
             del data["rclone_config"]
         if data.get("token_pickle"):
             del data["token_pickle"]
-        await self._db.users.replace_one({"_id": user_id}, data, upsert=True)
+        await self.db.users.replace_one({"_id": user_id}, data, upsert=True)

     async def update_user_doc(self, user_id, key, path=""):
         if self._return:
@@ -190,7 +122,7 @@ class DbManager:
             doc_bin = await doc.read()
         else:
             doc_bin = ""
-        await self._db.users.update_one(
+        await self.db.users.update_one(
             {"_id": user_id}, {"$set": {key: doc_bin}}, upsert=True
         )

@@ -198,39 +130,40 @@ class DbManager:
         if self._return:
             return
         for user_id in list(rss_dict.keys()):
-            await self._db.rss[BOT_ID].replace_one(
+            await self.db.rss[TgClient.ID].replace_one(
                 {"_id": user_id}, rss_dict[user_id], upsert=True
             )

     async def rss_update(self, user_id):
         if self._return:
             return
-        await self._db.rss[BOT_ID].replace_one(
+        await self.db.rss[TgClient.ID].replace_one(
             {"_id": user_id}, rss_dict[user_id], upsert=True
         )

     async def rss_delete(self, user_id):
         if self._return:
             return
-        await self._db.rss[BOT_ID].delete_one({"_id": user_id})
+        await self.db.rss[TgClient.ID].delete_one({"_id": user_id})

     async def add_incomplete_task(self, cid, link, tag):
         if self._return:
             return
-        await self._db.tasks[BOT_ID].insert_one({"_id": link, "cid": cid, "tag": tag})
+        await self.db.tasks[TgClient.ID].insert_one(
+            {"_id": link, "cid": cid, "tag": tag}
+        )

     async def rm_complete_task(self, link):
         if self._return:
             return
-        await self._db.tasks[BOT_ID].delete_one({"_id": link})
+        await self.db.tasks[TgClient.ID].delete_one({"_id": link})

     async def get_incomplete_tasks(self):
         notifier_dict = {}
         if self._return:
             return notifier_dict
-        if await self._db.tasks[BOT_ID].find_one():
-            # return a dict ==> {_id, cid, tag}
-            rows = self._db.tasks[BOT_ID].find({})
+        if await self.db.tasks[TgClient.ID].find_one():
+            rows = self.db.tasks[TgClient.ID].find({})
             async for row in rows:
                 if row["cid"] in list(notifier_dict.keys()):
                     if row["tag"] in list(notifier_dict[row["cid"]]):
@@ -239,13 +172,13 @@ class DbManager:
                         notifier_dict[row["cid"]][row["tag"]] = [row["_id"]]
                 else:
                     notifier_dict[row["cid"]] = {row["tag"]: [row["_id"]]}
-            await self._db.tasks[BOT_ID].drop()
-        return notifier_dict  # return a dict ==> {cid: {tag: [_id, _id, ...]}}
+            await self.db.tasks[TgClient.ID].drop()
+        return notifier_dict

     async def trunc_table(self, name):
         if self._return:
             return
-        await self._db[name][BOT_ID].drop()
+        await self.db[name][TgClient.ID].drop()


 database = DbManager()
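With the move from `config.env` to `config.py`, `update_deploy_config` now snapshots the config module by filtering dunder names out of `vars()` and stripping string values. A self-contained sketch of that snapshot step, using `types.ModuleType` in place of an actually imported `config` module (the sample settings are made up):

```python
from types import ModuleType


def snapshot_config(settings: ModuleType) -> dict:
    """Collect module-level settings as a dict, skipping __dunder__ attributes."""
    return {
        key: value.strip() if isinstance(value, str) else value
        for key, value in vars(settings).items()
        if not key.startswith("__")
    }


fake = ModuleType("config")          # stands in for import_module("config")
fake.BOT_TOKEN = " 123:abc "         # strings get whitespace-stripped
fake.LEECH_SPLIT_SIZE = 2097152000   # non-strings pass through unchanged
snap = snapshot_config(fake)
# → {"BOT_TOKEN": "123:abc", "LEECH_SPLIT_SIZE": 2097152000}
```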
@@ -7,7 +7,8 @@ from shutil import rmtree
 from subprocess import run as srun
 from sys import exit

-from bot import aria2, LOGGER, DOWNLOAD_DIR, qbittorrent_client
+from ... import aria2, LOGGER, qbittorrent_client
+from ...core.config_manager import Config
 from .bot_utils import sync_to_async, cmd_exec
 from .exceptions import NotSupportedExtractionArchive

@@ -96,10 +97,10 @@ def clean_all():
     qbittorrent_client.torrents_delete(torrent_hashes="all")
     try:
         LOGGER.info("Cleaning Download Directory")
-        rmtree(DOWNLOAD_DIR, ignore_errors=True)
+        rmtree(Config.DOWNLOAD_DIR, ignore_errors=True)
     except:
         pass
-    makedirs(DOWNLOAD_DIR, exist_ok=True)
+    makedirs(Config.DOWNLOAD_DIR, exist_ok=True)


 def exit_clean_up(signal, frame):
|
||||
from ..telegram_helper.bot_commands import BotCommands
|
||||
|
||||
mirror = """<b>Send link along with command line or </b>
|
||||
|
||||
/cmd link
|
||||
@ -239,8 +241,7 @@ list of lists of ffmpeg commands. You can set multiple ffmpeg commands for all f
|
||||
Notes:
|
||||
1. Add <code>-del</code> to the list(s) which you want from the bot to delete the original files after command run complete!
|
||||
2. Seed will get disbaled while using this option
|
||||
3. It must be list of list(s) event of one list added like [["-i", "mltb.mkv", "-c", "copy", "-c:s", "srt", "mltb.mkv", "-del"]]
|
||||
Examples: [["-i", "mltb.mkv", "-c", "copy", "-c:s", "srt", "mltb.mkv", "-del"], ["-i", "mltb.video", "-c", "copy", "-c:s", "srt", "mltb"], ["-i", "mltb.m4a", "-c:a", "libmp3lame", "-q:a", "2", "mltb.mp3"], ["-i", "mltb.audio", "-c:a", "libmp3lame", "-q:a", "2", "mltb.mp3"]]
|
||||
Examples: ["-i mltb.mkv -c copy -c:s srt mltb.mkv", "-i mltb.video -c copy -c:s srt mltb", "-i mltb.m4a -c:a libmp3lame -q:a 2 mltb.mp3", "-i mltb.audio -c:a libmp3lame -q:a 2 mltb.mp3"]
|
||||
Here I will explain how to use mltb.* which is reference to files you want to work on.
|
||||
1. First cmd: the input is mltb.mkv so this cmd will work only on mkv videos and the output is mltb.mkv also so all outputs is mkv. -del will delete the original media after complete run of the cmd.
|
||||
2. Second cmd: the input is mltb.video so this cmd will work on all videos and the output is only mltb so the extenstion is same as input files.
|
||||
@ -339,3 +340,44 @@ PASSWORD_ERROR_MESSAGE = """
|
||||
|
||||
<b>Example:</b> link::my password
|
||||
"""
|
||||
|
||||
|
||||
help_string = f"""
NOTE: Try each command without any argument to see more details.
/{BotCommands.MirrorCommand[0]} or /{BotCommands.MirrorCommand[1]}: Start mirroring to cloud.
/{BotCommands.QbMirrorCommand[0]} or /{BotCommands.QbMirrorCommand[1]}: Start mirroring to cloud using qBittorrent.
/{BotCommands.JdMirrorCommand[0]} or /{BotCommands.JdMirrorCommand[1]}: Start mirroring to cloud using JDownloader.
/{BotCommands.NzbMirrorCommand[0]} or /{BotCommands.NzbMirrorCommand[1]}: Start mirroring to cloud using Sabnzbd.
/{BotCommands.YtdlCommand[0]} or /{BotCommands.YtdlCommand[1]}: Mirror yt-dlp supported link.
/{BotCommands.LeechCommand[0]} or /{BotCommands.LeechCommand[1]}: Start leeching to Telegram.
/{BotCommands.QbLeechCommand[0]} or /{BotCommands.QbLeechCommand[1]}: Start leeching using qBittorrent.
/{BotCommands.JdLeechCommand[0]} or /{BotCommands.JdLeechCommand[1]}: Start leeching using JDownloader.
/{BotCommands.NzbLeechCommand[0]} or /{BotCommands.NzbLeechCommand[1]}: Start leeching using Sabnzbd.
/{BotCommands.YtdlLeechCommand[0]} or /{BotCommands.YtdlLeechCommand[1]}: Leech yt-dlp supported link.
/{BotCommands.CloneCommand} [drive_url]: Copy file/folder to Google Drive.
/{BotCommands.CountCommand} [drive_url]: Count file/folder of Google Drive.
/{BotCommands.DeleteCommand} [drive_url]: Delete file/folder from Google Drive (Only Owner & Sudo).
/{BotCommands.UserSetCommand[0]} or /{BotCommands.UserSetCommand[1]} [query]: Users settings.
/{BotCommands.BotSetCommand[0]} or /{BotCommands.BotSetCommand[1]} [query]: Bot settings.
/{BotCommands.SelectCommand}: Select files from torrents or nzb by gid or reply.
/{BotCommands.CancelTaskCommand[0]} or /{BotCommands.CancelTaskCommand[1]} [gid]: Cancel task by gid or reply.
/{BotCommands.ForceStartCommand[0]} or /{BotCommands.ForceStartCommand[1]} [gid]: Force start task by gid or reply.
/{BotCommands.CancelAllCommand} [query]: Cancel all [status] tasks.
/{BotCommands.ListCommand} [query]: Search in Google Drive(s).
/{BotCommands.SearchCommand} [query]: Search for torrents with API.
/{BotCommands.StatusCommand}: Show the status of all downloads.
/{BotCommands.StatsCommand}: Show stats of the machine the bot is hosted on.
/{BotCommands.PingCommand}: Check how long it takes to Ping the Bot (Only Owner & Sudo).
/{BotCommands.AuthorizeCommand}: Authorize a chat or a user to use the bot (Only Owner & Sudo).
/{BotCommands.UnAuthorizeCommand}: Unauthorize a chat or a user to use the bot (Only Owner & Sudo).
/{BotCommands.UsersCommand}: Show users settings (Only Owner & Sudo).
/{BotCommands.AddSudoCommand}: Add sudo user (Only Owner).
/{BotCommands.RmSudoCommand}: Remove sudo users (Only Owner).
/{BotCommands.RestartCommand}: Restart and update the bot (Only Owner & Sudo).
/{BotCommands.LogCommand}: Get a log file of the bot. Handy for getting crash reports (Only Owner & Sudo).
/{BotCommands.ShellCommand}: Run shell commands (Only Owner).
/{BotCommands.AExecCommand}: Exec async functions (Only Owner).
/{BotCommands.ExecCommand}: Exec sync functions (Only Owner).
/{BotCommands.ClearLocalsCommand}: Clear {BotCommands.AExecCommand} or {BotCommands.ExecCommand} locals (Only Owner).
/{BotCommands.RssCommand}: RSS Menu.
"""

@@ -4,7 +4,9 @@ from json import dump
from random import randint
from re import match

from bot import config_dict, LOGGER, bot_name
from ... import LOGGER
from ...core.mltb_client import TgClient
from ...core.config_manager import Config
from .bot_utils import cmd_exec, new_task
from myjd import MyJdApi

@@ -21,12 +23,12 @@ class JDownloader(MyJdApi):
    @new_task
    async def boot(self):
        await cmd_exec(["pkill", "-9", "-f", "java"])
        if not config_dict["JD_EMAIL"] or not config_dict["JD_PASS"]:
        if not Config.JD_EMAIL or not Config.JD_PASS:
            self.is_connected = False
            self.error = "JDownloader Credentials not provided!"
            return
        self.error = "Connecting... Try again after a couple of seconds"
        self._device_name = f"{randint(0, 1000)}@{bot_name}"
        self._device_name = f"{randint(0, 1000)}@{TgClient.NAME}"
        if await path.exists("/JDownloader/logs"):
            LOGGER.info(
                "Starting JDownloader... This might take up to 10 sec and might restart once if update available!"
@@ -37,9 +39,9 @@ class JDownloader(MyJdApi):
            )
        jdata = {
            "autoconnectenabledv2": True,
            "password": config_dict["JD_PASS"],
            "password": Config.JD_PASS,
            "devicename": f"{self._device_name}",
            "email": config_dict["JD_EMAIL"],
            "email": Config.JD_EMAIL,
        }
        remote_data = {
            "localapiserverheaderaccesscontrollalloworigin": "",
@@ -7,7 +7,8 @@ from re import search as re_search, escape
from time import time
from aioshutil import rmtree

from bot import LOGGER, subprocess_lock, DOWNLOAD_DIR
from ... import LOGGER, subprocess_lock
from ...core.config_manager import Config
from .bot_utils import cmd_exec, sync_to_async
from .files_utils import ARCH_EXT, get_mime_type

@@ -123,7 +124,7 @@ async def convert_audio(listener, audio_file, ext):
async def create_thumb(msg, _id=""):
    if not _id:
        _id = msg.id
        path = f"{DOWNLOAD_DIR}Thumbnails"
        path = f"{Config.DOWNLOAD_DIR}Thumbnails"
    else:
        path = "Thumbnails"
    await makedirs(path, exist_ok=True)
@@ -296,7 +297,7 @@ async def take_ss(video_file, ss_nb) -> bool:


async def get_audio_thumbnail(audio_file):
    output_dir = f"{DOWNLOAD_DIR}Thumbnails"
    output_dir = f"{Config.DOWNLOAD_DIR}Thumbnails"
    await makedirs(output_dir, exist_ok=True)
    output = ospath.join(output_dir, f"{time()}.jpg")
    cmd = [
@@ -323,7 +324,7 @@ async def get_audio_thumbnail(audio_file):


async def get_video_thumbnail(video_file, duration):
    output_dir = f"{DOWNLOAD_DIR}Thumbnails"
    output_dir = f"{Config.DOWNLOAD_DIR}Thumbnails"
    await makedirs(output_dir, exist_ok=True)
    output = ospath.join(output_dir, f"{time()}.jpg")
    if duration is None:
@@ -371,7 +372,7 @@ async def get_multiple_frames_thumbnail(video_file, layout, keep_screenshots):
    dirpath = await take_ss(video_file, ss_nb)
    if not dirpath:
        return None
    output_dir = f"{DOWNLOAD_DIR}Thumbnails"
    output_dir = f"{Config.DOWNLOAD_DIR}Thumbnails"
    await makedirs(output_dir, exist_ok=True)
    output = ospath.join(output_dir, f"{time()}.jpg")
    cmd = [
@@ -3,14 +3,13 @@ from psutil import virtual_memory, cpu_percent, disk_usage
from time import time
from asyncio import iscoroutinefunction

from bot import (
    DOWNLOAD_DIR,
from ... import (
    task_dict,
    task_dict_lock,
    bot_start_time,
    config_dict,
    status_dict,
)
from ...core.config_manager import Config
from .bot_utils import sync_to_async
from ..telegram_helper.button_build import ButtonMaker

@@ -169,7 +168,7 @@ async def get_readable_message(sid, is_user, page_no=1, status="All", page_step=

    tasks = await sync_to_async(get_specific_tasks, status, sid if is_user else None)

    STATUS_LIMIT = config_dict["STATUS_LIMIT"]
    STATUS_LIMIT = Config.STATUS_LIMIT
    tasks_no = len(tasks)
    pages = (max(tasks_no, 1) + STATUS_LIMIT - 1) // STATUS_LIMIT
    if page_no > pages:
@@ -241,6 +240,6 @@ async def get_readable_message(sid, is_user, page_no=1, status="All", page_step=
        buttons.data_button(label, f"status {sid} st {status_value}")
    buttons.data_button("♻️", f"status {sid} ref", position="header")
    button = buttons.build_menu(8)
    msg += f"<b>CPU:</b> {cpu_percent()}% | <b>FREE:</b> {get_readable_file_size(disk_usage(DOWNLOAD_DIR).free)}"
    msg += f"<b>CPU:</b> {cpu_percent()}% | <b>FREE:</b> {get_readable_file_size(disk_usage(Config.DOWNLOAD_DIR).free)}"
    msg += f"\n<b>RAM:</b> {virtual_memory().percent}% | <b>UPTIME:</b> {get_readable_time(time() - bot_start_time)}"
    return msg, button
@@ -1,7 +1,6 @@
from asyncio import Event

from bot import (
    config_dict,
from ... import (
    queued_dl,
    queued_up,
    non_queued_up,
@@ -9,10 +8,11 @@ from bot import (
    queue_dict_lock,
    LOGGER,
)
from ...core.config_manager import Config
from ..mirror_leech_utils.gdrive_utils.search import GoogleDriveSearch
from .bot_utils import sync_to_async, get_telegraph_list
from .files_utils import get_base_name
from .links_utils import is_gdrive_id
from ..mirror_leech_utils.gdrive_utils.search import GoogleDriveSearch


async def stop_duplicate_check(listener):
@@ -54,10 +54,8 @@ async def stop_duplicate_check(listener):


async def check_running_tasks(listener, state="dl"):
    all_limit = config_dict["QUEUE_ALL"]
    state_limit = (
        config_dict["QUEUE_DOWNLOAD"] if state == "dl" else config_dict["QUEUE_UPLOAD"]
    )
    all_limit = Config.QUEUE_ALL
    state_limit = Config.QUEUE_DOWNLOAD if state == "dl" else Config.QUEUE_UPLOAD
    event = None
    is_over_limit = False
    async with queue_dict_lock:
@@ -105,9 +103,9 @@ async def start_up_from_queued(mid: int):


async def start_from_queued():
    if all_limit := config_dict["QUEUE_ALL"]:
        dl_limit = config_dict["QUEUE_DOWNLOAD"]
        up_limit = config_dict["QUEUE_UPLOAD"]
    if all_limit := Config.QUEUE_ALL:
        dl_limit = Config.QUEUE_DOWNLOAD
        up_limit = Config.QUEUE_UPLOAD
        async with queue_dict_lock:
            dl = len(non_queued_dl)
            up = len(non_queued_up)
@@ -127,7 +125,7 @@
                    break
        return

    if up_limit := config_dict["QUEUE_UPLOAD"]:
    if up_limit := Config.QUEUE_UPLOAD:
        async with queue_dict_lock:
            up = len(non_queued_up)
            if queued_up and up < up_limit:
@@ -142,7 +140,7 @@
            for mid in list(queued_up.keys()):
                await start_up_from_queued(mid)

    if dl_limit := config_dict["QUEUE_DOWNLOAD"]:
    if dl_limit := Config.QUEUE_DOWNLOAD:
        async with queue_dict_lock:
            dl = len(non_queued_dl)
            if queued_dl and dl < dl_limit:
@@ -3,7 +3,7 @@ from secrets import token_urlsafe
from telegraph.aio import Telegraph
from telegraph.exceptions import RetryAfterError

from bot import LOGGER
from ... import LOGGER


class TelegraphHelper:
@@ -2,7 +2,8 @@ from aiofiles.os import remove, path as aiopath
from asyncio import sleep
from time import time

from bot import aria2, task_dict_lock, task_dict, LOGGER, config_dict, intervals
from ... import aria2, task_dict_lock, task_dict, LOGGER, intervals
from ...core.config_manager import Config
from ..ext_utils.bot_utils import loop_thread, bt_selection_buttons, sync_to_async
from ..ext_utils.files_utils import clean_unwanted
from ..ext_utils.status_utils import get_task_by_gid
@@ -64,7 +65,7 @@ async def _on_download_complete(api, gid):
        LOGGER.info(f"Gid changed from {gid} to {new_gid}")
        if task := await get_task_by_gid(new_gid):
            task.listener.is_torrent = True
            if config_dict["BASE_URL"] and task.listener.select:
            if Config.BASE_URL and task.listener.select:
                if not task.queued:
                    await sync_to_async(api.client.force_pause, new_gid)
                SBUTTONS = bt_selection_buttons(new_gid)
@@ -1,6 +1,6 @@
from time import sleep

from bot import LOGGER, aria2
from ... import LOGGER, aria2
from ..ext_utils.bot_utils import async_to_sync, sync_to_async


@@ -1,6 +1,6 @@
from asyncio import sleep

from bot import intervals, jd_lock, jd_downloads
from ... import intervals, jd_lock, jd_downloads
from ..ext_utils.bot_utils import new_task
from ..ext_utils.jdownloader_booter import jdownloader
from ..ext_utils.status_utils import get_task_by_gid

@@ -1,6 +1,6 @@
from asyncio import sleep, gather

from bot import (
from ... import (
    intervals,
    sabnzbd_client,
    nzb_jobs,
@@ -2,16 +2,16 @@ from aiofiles.os import remove, path as aiopath
from asyncio import sleep
from time import time

from bot import (
from ... import (
    task_dict,
    task_dict_lock,
    intervals,
    qbittorrent_client,
    config_dict,
    qb_torrents,
    qb_listener_lock,
    LOGGER,
)
from ...core.config_manager import Config
from ..ext_utils.bot_utils import new_task, sync_to_async
from ..ext_utils.files_utils import clean_unwanted
from ..ext_utils.status_utils import get_readable_time, get_task_by_gid
@@ -127,11 +127,11 @@ async def _qb_listener():
                continue
            state = tor_info.state
            if state == "metaDL":
                TORRENT_TIMEOUT = config_dict["TORRENT_TIMEOUT"]
                qb_torrents[tag]["stalled_time"] = time()
                if (
                    TORRENT_TIMEOUT
                    and time() - tor_info.added_on >= TORRENT_TIMEOUT
                    Config.TORRENT_TIMEOUT
                    and time() - qb_torrents[tag]["start_time"]
                    >= Config.TORRENT_TIMEOUT
                ):
                    await _on_download_error("Dead Torrent!", tor_info)
                else:
@@ -145,7 +145,6 @@
                    qb_torrents[tag]["stop_dup_check"] = True
                    await _stop_duplicate(tor_info)
            elif state == "stalledDL":
                TORRENT_TIMEOUT = config_dict["TORRENT_TIMEOUT"]
                if (
                    not qb_torrents[tag]["rechecked"]
                    and 0.99989999999999999 < tor_info.progress < 1
@@ -160,9 +159,9 @@
                    )
                    qb_torrents[tag]["rechecked"] = True
                elif (
                    TORRENT_TIMEOUT
                    Config.TORRENT_TIMEOUT
                    and time() - qb_torrents[tag]["stalled_time"]
                    >= TORRENT_TIMEOUT
                    >= Config.TORRENT_TIMEOUT
                ):
                    await _on_download_error("Dead Torrent!", tor_info)
                else:
@@ -202,6 +201,7 @@
async def on_download_start(tag):
    async with qb_listener_lock:
        qb_torrents[tag] = {
            "start_time": time(),
            "stalled_time": time(),
            "stop_dup_check": False,
            "rechecked": False,
@@ -4,15 +4,12 @@ from asyncio import sleep, gather
from html import escape
from requests import utils as rutils

from bot import (
from ... import (
    intervals,
    aria2,
    DOWNLOAD_DIR,
    task_dict,
    task_dict_lock,
    LOGGER,
    DATABASE_URL,
    config_dict,
    non_queued_up,
    non_queued_dl,
    queued_up,
@@ -20,6 +17,7 @@ from bot import (
    queue_dict_lock,
    same_directory_lock,
)
from ...core.config_manager import Config
from ..common import TaskConfig
from ..ext_utils.bot_utils import sync_to_async
from ..ext_utils.db_handler import database
@@ -74,8 +72,8 @@ class TaskListener(TaskConfig):
    async def on_download_start(self):
        if (
            self.is_super_chat
            and config_dict["INCOMPLETE_TASK_NOTIFIER"]
            and DATABASE_URL
            and Config.INCOMPLETE_TASK_NOTIFIER
            and Config.DATABASE_URL
        ):
            await database.add_incomplete_task(
                self.message.chat.id, self.message.link, self.tag
@@ -107,7 +105,7 @@
                        des_id = list(self.same_dir[self.folder_name]["tasks"])[
                            0
                        ]
                        des_path = f"{DOWNLOAD_DIR}{des_id}{self.folder_name}"
                        des_path = f"{Config.DOWNLOAD_DIR}{des_id}{self.folder_name}"
                        await makedirs(des_path, exist_ok=True)
                        LOGGER.info(f"Moving files from {self.mid} to {des_id}")
                        for item in await listdir(spath):
@@ -157,7 +155,7 @@

        up_path = f"{self.dir}/{self.name}"
        self.size = await get_path_size(up_path)
        if not config_dict["QUEUE_ALL"]:
        if not Config.QUEUE_ALL:
            async with queue_dict_lock:
                if self.mid in non_queued_dl:
                    non_queued_dl.remove(self.mid)
@@ -281,8 +279,8 @@
        ):
            if (
                self.is_super_chat
                and config_dict["INCOMPLETE_TASK_NOTIFIER"]
                and DATABASE_URL
                and Config.INCOMPLETE_TASK_NOTIFIER
                and Config.DATABASE_URL
            ):
                await database.rm_complete_task(self.message.link)
            msg = f"<b>Name: </b><code>{escape(self.name)}</code>\n\n<b>Size: </b>{get_readable_file_size(self.size)}"
@@ -312,7 +310,7 @@
            if (
                link
                or rclone_path
                and config_dict["RCLONE_SERVE_URL"]
                and Config.RCLONE_SERVE_URL
                and not self.private_link
            ):
                buttons = ButtonMaker()
@@ -320,14 +318,10 @@
                    buttons.url_button("☁️ Cloud Link", link)
                else:
                    msg += f"\n\nPath: <code>{rclone_path}</code>"
                if (
                    rclone_path
                    and (RCLONE_SERVE_URL := config_dict["RCLONE_SERVE_URL"])
                    and not self.private_link
                ):
                if rclone_path and Config.RCLONE_SERVE_URL and not self.private_link:
                    remote, path = rclone_path.split(":", 1)
                    url_path = rutils.quote(f"{path}")
                    share_url = f"{RCLONE_SERVE_URL}/{remote}/{url_path}"
                    share_url = f"{Config.RCLONE_SERVE_URL}/{remote}/{url_path}"
                    if mime_type == "Folder":
                        share_url += "/"
                    buttons.url_button("🔗 Rclone Link", share_url)
@@ -335,8 +329,8 @@
                    INDEX_URL = ""
                    if self.private_link:
                        INDEX_URL = self.user_dict.get("index_url", "") or ""
                    elif config_dict["INDEX_URL"]:
                        INDEX_URL = config_dict["INDEX_URL"]
                    elif Config.INDEX_URL:
                        INDEX_URL = Config.INDEX_URL
                    if INDEX_URL:
                        share_url = f"{INDEX_URL}findpath?id={dir_id}"
                        buttons.url_button("⚡ Index Link", share_url)
@@ -388,8 +382,8 @@

        if (
            self.is_super_chat
            and config_dict["INCOMPLETE_TASK_NOTIFIER"]
            and DATABASE_URL
            and Config.INCOMPLETE_TASK_NOTIFIER
            and Config.DATABASE_URL
        ):
            await database.rm_complete_task(self.message.link)

@@ -426,8 +420,8 @@

        if (
            self.is_super_chat
            and config_dict["INCOMPLETE_TASK_NOTIFIER"]
            and DATABASE_URL
            and Config.INCOMPLETE_TASK_NOTIFIER
            and Config.DATABASE_URL
        ):
            await database.rm_complete_task(self.message.link)

@@ -1,14 +1,7 @@
from aiofiles.os import remove, path as aiopath

from bot import (
    aria2,
    task_dict_lock,
    task_dict,
    LOGGER,
    config_dict,
    aria2_options,
    aria2c_global,
)
from .... import aria2, task_dict_lock, task_dict, LOGGER
from ....core.config_manager import Config
from ...ext_utils.bot_utils import bt_selection_buttons, sync_to_async
from ...ext_utils.task_manager import check_running_tasks
from ...mirror_leech_utils.status_utils.aria2_status import Aria2Status
@@ -16,9 +9,7 @@ from ...telegram_helper.message_utils import send_status_message, send_message


async def add_aria2c_download(listener, dpath, header, ratio, seed_time):
    a2c_opt = {**aria2_options}
    [a2c_opt.pop(k) for k in aria2c_global if k in aria2_options]
    a2c_opt["dir"] = dpath
    a2c_opt = {"dir": dpath}
    if listener.name:
        a2c_opt["out"] = listener.name
    if header:
@@ -27,7 +18,7 @@
        a2c_opt["seed-ratio"] = ratio
    if seed_time:
        a2c_opt["seed-time"] = seed_time
    if TORRENT_TIMEOUT := config_dict["TORRENT_TIMEOUT"]:
    if TORRENT_TIMEOUT := Config.TORRENT_TIMEOUT:
        a2c_opt["bt-stop-timeout"] = f"{TORRENT_TIMEOUT}"

    add_to_queue, event = await check_running_tasks(listener)
@@ -66,7 +57,7 @@

    if (
        not add_to_queue
        and (not listener.select or not config_dict["BASE_URL"])
        and (not listener.select or not Config.BASE_URL)
        and listener.multi <= 1
    ):
        await send_status_message(listener.message)
@@ -1,9 +1,7 @@
from secrets import token_urlsafe

from bot import (
from .... import (
    LOGGER,
    aria2_options,
    aria2c_global,
    task_dict,
    task_dict_lock,
)
@@ -44,12 +42,9 @@ async def add_direct_download(listener, path):
    if listener.is_cancelled:
        return

    a2c_opt = {**aria2_options}
    [a2c_opt.pop(k) for k in aria2c_global if k in aria2_options]
    a2c_opt = {"follow-torrent": "false", "follow-metalink": "false"}
    if header := details.get("header"):
        a2c_opt["header"] = header
    a2c_opt["follow-torrent"] = "false"
    a2c_opt["follow-metalink"] = "false"
    directListener = DirectListener(path, listener, a2c_opt)

    async with task_dict_lock:
@@ -13,7 +13,7 @@ from urllib3.util.retry import Retry
from uuid import uuid4
from base64 import b64decode

from bot import config_dict
from ....core.config_manager import Config
from ...ext_utils.exceptions import DirectDownloadLinkException
from ...ext_utils.help_messages import PASSWORD_ERROR_MESSAGE
from ...ext_utils.links_utils import is_share_link
@@ -1451,7 +1451,7 @@ def filelions_and_streamwish(url):
            "mycloudz.cc",
        ]
    ):
        apiKey = config_dict["FILELION_API"]
        apiKey = Config.FILELION_API
        apiUrl = "https://vidhideapi.com"
    elif any(
        x in hostname
@@ -1463,7 +1463,7 @@
            "streamwish.to",
        ]
    ):
        apiKey = config_dict["STREAMWISH_API"]
        apiKey = Config.STREAMWISH_API
        apiUrl = "https://api.streamwish.com"
    if not apiKey:
        raise DirectDownloadLinkException(
@@ -1,6 +1,6 @@
from secrets import token_urlsafe

from bot import task_dict, task_dict_lock, LOGGER
from .... import task_dict, task_dict_lock, LOGGER
from ...ext_utils.bot_utils import sync_to_async
from ...ext_utils.task_manager import check_running_tasks, stop_duplicate_check
from ...mirror_leech_utils.gdrive_utils.count import GoogleDriveCount
@@ -9,7 +9,7 @@ from base64 import b64encode
from secrets import token_urlsafe
from myjd.exception import MYJDException

from bot import (
from .... import (
    task_dict,
    task_dict_lock,
    LOGGER,
@@ -367,13 +367,13 @@ async def add_jd_download(listener, path):
    links_to_remove = []
    force_download = False
    for dlink in links:
        if dlink["status"] == "Invalid download directory":
        if dlink.get("status", "") == "Invalid download directory":
            force_download = True
            new_name, ext = dlink["name"].rsplit(".", 1)
            new_name = new_name[: 250 - len(f".{ext}".encode())]
            new_name = f"{new_name}.{ext}"
            await jdownloader.device.downloads.rename_link(dlink["uuid"], new_name)
        elif dlink["status"] == "HLS stream broken?":
        elif dlink.get("status", "") == "HLS stream broken?":
            links_to_remove.append(dlink["uuid"])

    if links_to_remove:
@@ -2,13 +2,13 @@ from aiofiles.os import remove, path as aiopath
from asyncio import gather, sleep
from sabnzbdapi.exception import NotLoggedIn, LoginFailed

from bot import (
from .... import (
    task_dict,
    task_dict_lock,
    sabnzbd_client,
    LOGGER,
    config_dict,
)
from ....core.config_manager import Config
from ...ext_utils.task_manager import check_running_tasks
from ...listeners.nzb_listener import on_download_start
from ...ext_utils.db_handler import database
@@ -21,15 +21,13 @@ async def add_servers():
    if res and (servers := res["servers"]):
        tasks = []
        servers_hosts = [x["host"] for x in servers]
        for server in list(config_dict["USENET_SERVERS"]):
        for server in list(Config.USENET_SERVERS):
            if server["host"] not in servers_hosts:
                tasks.append(sabnzbd_client.add_server(server))
                config_dict["USENET_SERVERS"].append(server)
                if config_dict["DATABASE_URL"]:
                Config.USENET_SERVERS.append(server)
                if Config.DATABASE_URL:
                    tasks.append(
                        database.update_config(
                            {"USENET_SERVERS": config_dict["USENET_SERVERS"]}
                        )
                        database.update_config({"USENET_SERVERS": Config.USENET_SERVERS})
                    )
        if tasks:
            try:
@@ -37,19 +35,18 @@
            except LoginFailed as e:
                raise e
    elif not res and (
        config_dict["USENET_SERVERS"]
        Config.USENET_SERVERS
        and (
            not config_dict["USENET_SERVERS"][0]["host"]
            or not config_dict["USENET_SERVERS"][0]["username"]
            or not config_dict["USENET_SERVERS"][0]["password"]
            not Config.USENET_SERVERS[0]["host"]
            or not Config.USENET_SERVERS[0]["username"]
            or not Config.USENET_SERVERS[0]["password"]
        )
        or not config_dict["USENET_SERVERS"]
        or not Config.USENET_SERVERS
    ):
        raise NotLoggedIn("Set USENET_SERVERS in bsetting or config!")
    else:
        if tasks := [
            sabnzbd_client.add_server(server)
            for server in config_dict["USENET_SERVERS"]
            sabnzbd_client.add_server(server) for server in Config.USENET_SERVERS
        ]:
            try:
                await gather(*tasks)
@@ -1,13 +1,13 @@
from aiofiles.os import remove, path as aiopath
from asyncio import sleep

from bot import (
from .... import (
    task_dict,
    task_dict_lock,
    qbittorrent_client,
    LOGGER,
    config_dict,
)
from ....core.config_manager import Config
from ...ext_utils.bot_utils import bt_selection_buttons, sync_to_async
from ...ext_utils.task_manager import check_running_tasks
from ...listeners.qbit_listener import on_download_start
@@ -89,7 +89,7 @@ async def add_qb_torrent(listener, path, ratio, seed_time):

    await listener.on_download_start()

    if config_dict["BASE_URL"] and listener.select:
    if Config.BASE_URL and listener.select:
        if listener.link.startswith("magnet:"):
            metamsg = "Downloading Metadata, wait then you can select files. Use torrent file to avoid this wait."
            meta = await send_message(listener.message, metamsg)
@@ -134,6 +134,7 @@
            LOGGER.info(
                f"Start Queued Download from Qbittorrent: {tor_info.name} - Hash: {ext_hash}"
            )
            await on_download_start(f"{listener.mid}")
            await sync_to_async(
                qbittorrent_client.torrents_start, torrent_hashes=ext_hash
            )
@@ -3,7 +3,7 @@ from json import loads
from secrets import token_urlsafe
from aiofiles.os import remove

from bot import task_dict, task_dict_lock, LOGGER
from .... import task_dict, task_dict_lock, LOGGER
from ...ext_utils.bot_utils import cmd_exec
from ...ext_utils.task_manager import check_running_tasks, stop_duplicate_check
from ...mirror_leech_utils.rclone_utils.transfer import RcloneTransferHelper
@@ -38,6 +38,11 @@ async def add_rclone_download(listener, path):
        "--config",
        config_path,
        f"{remote}:{rpath}",
        "--log-systemd",
        "--log-file",
        "rlog.txt",
        "--log-level",
        "ERROR",
    ]
    cmd2 = [
        "rclone",
@@ -47,13 +52,21 @@
        "--config",
        config_path,
        f"{remote}:{rpath}",
        "--log-systemd",
        "--log-file",
        "rlog.txt",
        "--log-level",
        "ERROR",
    ]
    if rclone_select:
        cmd2.extend(("--files-from", listener.link))
    res = await cmd_exec(cmd2)
    if res[2] != 0:
        if res[2] != -9:
            err = (res[1]or "Use <code>/shell cat rlog.txt</code> to see more information")
            err = (
                res[1]
                or "Use <code>/shell cat rlog.txt</code> to see more information"
            )
            msg = f"Error: While getting rclone stat/size. Path: {remote}:{listener.link}. Stderr: {err[:4000]}"
            await listener.on_download_error(msg)
            return
@@ -86,7 +99,7 @@
        if not str(err):
            err = "Use <code>/shell cat rlog.txt</code> to see more information"
        await listener.on_download_error(f"RcloneDownload JsonLoad: {err}")
        return
    return
    if rstat["IsDir"]:
        if not listener.name:
            listener.name = (
@@ -2,13 +2,12 @@ from asyncio import Lock, sleep
from time import time
from pyrogram.errors import FloodWait, FloodPremiumWait

from bot import (
from .... import (
    LOGGER,
    task_dict,
    task_dict_lock,
    bot,
    user,
)
from ....core.mltb_client import TgClient
from ...ext_utils.task_manager import check_running_tasks, stop_duplicate_check
from ...mirror_leech_utils.status_utils.queue_status import QueueStatus
from ...mirror_leech_utils.status_utils.telegram_status import TelegramStatus
@@ -50,12 +49,12 @@ class TelegramDownloadHelper:
        else:
            LOGGER.info(f"Start Queued Download from Telegram: {self._listener.name}")

    async def _on_download_progress(self, current, total):
    async def _on_download_progress(self, current, _):
        if self._listener.is_cancelled:
            if self.session == "user":
                user.stop_transmission()
                TgClient.user.stop_transmission()
            else:
                bot.stop_transmission()
                TgClient.bot.stop_transmission()
        self._processed_bytes = current

    async def _on_download_error(self, error):
@@ -75,11 +74,12 @@
                file_name=path, progress=self._on_download_progress
            )
            if self._listener.is_cancelled:
                await self._on_download_error("Cancelled by user!")
                return
        except (FloodWait, FloodPremiumWait) as f:
            LOGGER.warning(str(f))
            await sleep(f.value)
            await self._download(message, path)
            return
        except Exception as e:
            LOGGER.error(str(e))
            await self._on_download_error(str(e))
@@ -97,7 +97,7 @@
            and self._listener.is_super_chat
        ):
            self.session = "user"
            message = await user.get_messages(
            message = await TgClient.user.get_messages(
                chat_id=message.chat.id, message_ids=message.id
            )
        elif self.session != "user":
@@ -165,3 +165,4 @@
        LOGGER.info(
            f"Cancelling download on user request: name: {self._listener.name} id: {self._id}"
        )
        await self._on_download_error("Cancelled by user!")
@@ -4,7 +4,7 @@ from re import search as re_search
from secrets import token_urlsafe
from yt_dlp import YoutubeDL, DownloadError

from bot import task_dict_lock, task_dict
from .... import task_dict_lock, task_dict
from ...ext_utils.bot_utils import sync_to_async, async_to_sync
from ...ext_utils.task_manager import check_running_tasks, stop_duplicate_check
from ...mirror_leech_utils.status_utils.queue_status import QueueStatus
@@ -1,7 +1,7 @@
from googleapiclient.errors import HttpError
from logging import getLogger

from bot.helper.mirror_leech_utils.gdrive_utils.helper import GoogleDriveHelper
from ....helper.mirror_leech_utils.gdrive_utils.helper import GoogleDriveHelper

LOGGER = getLogger(__name__)

@@ -7,15 +7,15 @@ from os import path as ospath, listdir
 from pickle import load as pload
 from random import randrange
 from re import search as re_search
-from urllib.parse import parse_qs, urlparse
 from tenacity import (
     retry,
     wait_exponential,
     stop_after_attempt,
     retry_if_exception_type,
 )
+from urllib.parse import parse_qs, urlparse

-from bot import config_dict
+from ....core.config_manager import Config
 from ...ext_utils.links_utils import is_gdrive_id

 LOGGER = getLogger(__name__)

@@ -46,7 +46,7 @@ class GoogleDriveHelper:
        self.total_time = 0
        self.status = None
        self.update_interval = 3
-        self.use_sa = config_dict["USE_SERVICE_ACCOUNTS"]
+        self.use_sa = Config.USE_SERVICE_ACCOUNTS

    @property
    def speed(self):

@@ -208,7 +208,7 @@ class GoogleDriveHelper:
            .execute()
        )
        file_id = file.get("id")
-        if not config_dict["IS_TEAM_DRIVE"]:
+        if not Config.IS_TEAM_DRIVE:
            self.set_permission(file_id)
        LOGGER.info(f'Created G-Drive Folder:\nName: {file.get("name")}\nID: {file_id}')
        return file_id
@@ -8,7 +8,7 @@ from pyrogram.handlers import CallbackQueryHandler
 from tenacity import RetryError
 from time import time

-from bot import config_dict
+from ....core.config_manager import Config
 from ...ext_utils.bot_utils import update_user_ldata, new_task
 from ...ext_utils.db_handler import database
 from ...ext_utils.status_utils import get_readable_file_size, get_readable_time

@@ -90,7 +90,7 @@ async def id_updates(_, query, obj):
        if id_ != obj.listener.user_dict.get("gdrive_id"):
            update_user_ldata(obj.listener.user_id, "gdrive_id", id_)
            await obj.get_items_buttons()
-            if config_dict["DATABASE_URL"]:
+            if Config.DATABASE_URL:
                await database.update_user_data(obj.listener.user_id)
    elif data[1] == "owner":
        obj.token_path = "token.pickle"

@@ -211,7 +211,7 @@ class GoogleDriveList(GoogleDriveHelper):
        )
        if self.list_status == "gdu":
            default_id = (
-                self.listener.user_dict.get("gdrive_id") or config_dict["GDRIVE_ID"]
+                self.listener.user_dict.get("gdrive_id") or Config.GDRIVE_ID
            )
            msg += f"\nDefault Gdrive ID: {default_id}" if default_id else ""
        msg += f"\n\nItems: {items_no}"

@@ -225,9 +225,9 @@ class GoogleDriveList(GoogleDriveHelper):

    async def get_items(self, itype=""):
        if itype:
-            self.item_type == itype
+            self.item_type = itype
        elif self.list_status == "gdu":
-            self.item_type == "folders"
+            self.item_type = "folders"
        try:
            files = self.get_files_by_folder_id(self.id, self.item_type)
            if self.listener.is_cancelled:

@@ -242,10 +242,11 @@ class GoogleDriveList(GoogleDriveHelper):
        if len(files) == 0 and itype != self.item_type and self.list_status == "gdd":
            itype = "folders" if self.item_type == "files" else "files"
            self.item_type = itype
-            return await self.get_items(itype)
-        self.items_list = natsorted(files)
-        self.iter_start = 0
-        await self.get_items_buttons()
+            await self.get_items(itype)
+        else:
+            self.items_list = natsorted(files)
+            self.iter_start = 0
+            await self.get_items_buttons()

    async def list_drives(self):
        self.service = self.authorize()
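One of the "few errors" this commit fixes is visible in the `get_items` hunk above: `self.item_type == itype` is a comparison used as a statement (a no-op), where an assignment was intended. A small sketch of the corrected logic in isolation (function and argument names are illustrative):

```python
def set_item_type(item_type, itype="", list_status=""):
    # Before the fix, these branches read `item_type == itype`, which
    # evaluates to a bool and discards it; `=` actually stores the value.
    if itype:
        item_type = itype
    elif list_status == "gdu":
        item_type = "folders"
    return item_type
```

Linters flag this class of bug (e.g. pylint's `pointless-statement`), which is one reason to run them on refactors like this.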
@@ -1,8 +1,8 @@
 from logging import getLogger

-from bot import drives_names, drives_ids, index_urls, user_data
-from bot.helper.ext_utils.status_utils import get_readable_file_size
-from bot.helper.mirror_leech_utils.gdrive_utils.helper import GoogleDriveHelper
+from .... import drives_names, drives_ids, index_urls, user_data
+from ....helper.ext_utils.status_utils import get_readable_file_size
+from ....helper.mirror_leech_utils.gdrive_utils.helper import GoogleDriveHelper

 LOGGER = getLogger(__name__)
@@ -10,7 +10,7 @@ from tenacity import (
     RetryError,
 )

-from bot import config_dict
+from ....core.config_manager import Config
 from ...ext_utils.bot_utils import async_to_sync, SetInterval
 from ...ext_utils.files_utils import get_mime_type
 from ...mirror_leech_utils.gdrive_utils.helper import GoogleDriveHelper

@@ -161,7 +161,7 @@ class GoogleDriveUpload(GoogleDriveHelper):
            )
            .execute()
        )
-        if not config_dict["IS_TEAM_DRIVE"]:
+        if not Config.IS_TEAM_DRIVE:
            self.set_permission(response["id"])

        drive_file = (

@@ -227,7 +227,7 @@ class GoogleDriveUpload(GoogleDriveHelper):
                pass
            self.file_processed_bytes = 0
        # Insert new permissions
-        if not config_dict["IS_TEAM_DRIVE"]:
+        if not Config.IS_TEAM_DRIVE:
            self.set_permission(response["id"])
        # Define file instance and get url for download
        if not in_dir:
@@ -8,7 +8,8 @@ from pyrogram.filters import regex, user
 from pyrogram.handlers import CallbackQueryHandler
 from time import time

-from bot import LOGGER, config_dict
+from .... import LOGGER
+from ....core.config_manager import Config
 from ...ext_utils.bot_utils import cmd_exec, update_user_ldata, new_task
 from ...ext_utils.db_handler import database
 from ...ext_utils.status_utils import get_readable_file_size, get_readable_time

@@ -115,7 +116,7 @@ async def path_updates(_, query, obj):
        if path != obj.listener.user_dict.get("rclone_path"):
            update_user_ldata(obj.listener.user_id, "rclone_path", path)
            await obj.get_path_buttons()
-            if config_dict["DATABASE_URL"]:
+            if Config.DATABASE_URL:
                await database.update_user_data(obj.listener.user_id)
    elif data[1] == "owner":
        obj.config_path = "rclone.conf"

@@ -241,7 +242,7 @@ class RcloneList:
            else "\nTransfer Type: <i>Upload</i>"
        )
        if self.list_status == "rcu":
-            default_path = config_dict["RCLONE_PATH"]
+            default_path = Config.RCLONE_PATH
            msg += f"\nDefault Rclone Path: {default_path}" if default_path else ""
        msg += f"\n\nItems: {items_no}"
        if items_no > LIST_LIMIT:

@@ -253,9 +254,9 @@ class RcloneList:

    async def get_path(self, itype=""):
        if itype:
-            self.item_type == itype
+            self.item_type = itype
        elif self.list_status == "rcu":
-            self.item_type == "--dirs-only"
+            self.item_type = "--dirs-only"
        cmd = [
            "rclone",
            "lsjson",

@@ -266,11 +267,34 @@ class RcloneList:
            "--config",
            self.config_path,
            f"{self.remote}{self.path}",
+            "--log-systemd",
+            "--log-file",
+            "rlog.txt",
+            "--log-level",
+            "ERROR",
        ]
        if self.listener.is_cancelled:
            return
        res, err, code = await cmd_exec(cmd)
-        if code not in [0, -9]:
+        if code in [0, -9]:
+            result = loads(res)
+            if (
+                len(result) == 0
+                and itype != self.item_type
+                and self.list_status == "rcd"
+            ):
+                itype = (
+                    "--dirs-only"
+                    if self.item_type == "--files-only"
+                    else "--files-only"
+                )
+                self.item_type = itype
+                await self.get_path(itype)
+            else:
+                self.path_list = sorted(result, key=lambda x: x["Path"])
+                self.iter_start = 0
+                await self.get_path_buttons()
+        else:
            if not err:
                err = "Use <code>/shell cat rlog.txt</code> to see more information"
            LOGGER.error(

@@ -279,17 +303,6 @@ class RcloneList:
            self.remote = err[:4000]
            self.path = ""
            self.event.set()
-            return
-        result = loads(res)
-        if len(result) == 0 and itype != self.item_type and self.list_status == "rcd":
-            itype = (
-                "--dirs-only" if self.item_type == "--files-only" else "--files-only"
-            )
-            self.item_type = itype
-            return await self.get_path(itype)
-        self.path_list = sorted(result, key=lambda x: x["Path"])
-        self.iter_start = 0
-        await self.get_path_buttons()

    async def list_remotes(self):
        config = RawConfigParser()
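The `get_path` rewrite above parses the JSON array that `rclone lsjson` prints (entries with a `"Path"` key) and sorts it before building buttons. A minimal sketch of that step, with made-up sample output:

```python
from json import loads

# `rclone lsjson remote:path` emits a JSON array of entries; each entry
# carries at least "Path" and "IsDir". The listing is sorted by "Path"
# before rendering. The raw string below is illustrative sample data.
raw = '[{"Path": "b.mkv", "IsDir": false}, {"Path": "a", "IsDir": true}]'
result = loads(raw)
path_list = sorted(result, key=lambda x: x["Path"])
```

Sorting on the decoded structure (rather than the raw text) keeps directories and files in one stable order regardless of how rclone emits them.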
@@ -3,13 +3,13 @@ from aiofiles.os import path as aiopath
 from asyncio import create_subprocess_exec
 from configparser import RawConfigParser

-from bot import config_dict
+from ....core.config_manager import Config

 RcloneServe = []


 async def rclone_serve_booter():
-    if not config_dict["RCLONE_SERVE_URL"] or not await aiopath.exists("rclone.conf"):
+    if not Config.RCLONE_SERVE_URL or not await aiopath.exists("rclone.conf"):
         if RcloneServe:
             try:
                 RcloneServe[0].kill()

@@ -43,17 +43,20 @@ async def rclone_serve_booter():
        "--no-modtime",
        "combine:",
        "--addr",
-        f":{config_dict['RCLONE_SERVE_PORT']}",
+        f":{Config.RCLONE_SERVE_PORT}",
        "--vfs-cache-mode",
        "full",
        "--vfs-cache-max-age",
        "1m0s",
        "--buffer-size",
        "64M",
+        "--log-systemd",
+        "--log-file",
+        "rlog.txt",
+        "--log-level",
+        "ERROR",
    ]
-    if (user := config_dict["RCLONE_SERVE_USER"]) and (
-        pswd := config_dict["RCLONE_SERVE_PASS"]
-    ):
+    if (user := Config.RCLONE_SERVE_USER) and (pswd := Config.RCLONE_SERVE_PASS):
        cmd.extend(("--user", user, "--pass", pswd))
    rcs = await create_subprocess_exec(*cmd)
    RcloneServe.append(rcs)
@@ -8,7 +8,7 @@ from logging import getLogger
 from random import randrange
 from re import findall as re_findall

-from bot import config_dict
+from ....core.config_manager import Config
 from ...ext_utils.bot_utils import cmd_exec, sync_to_async
 from ...ext_utils.files_utils import (
     get_mime_type,

@@ -33,7 +33,7 @@ class RcloneTransferHelper:
        self._sa_count = 1
        self._sa_index = 0
        self._sa_number = 0
-        self._use_service_accounts = config_dict["USE_SERVICE_ACCOUNTS"]
+        self._use_service_accounts = Config.USE_SERVICE_ACCOUNTS
        self.rclone_select = False

    @property

@@ -121,7 +121,9 @@ class RcloneTransferHelper:
        if return_code == 0:
            await self._listener.on_download_complete()
        elif return_code != -9:
-            error = (await self._proc.stderr.read()).decode().strip() or "Use <code>/shell cat rlog.txt</code> to see more information"
+            error = (
+                await self._proc.stderr.read()
+            ).decode().strip() or "Use <code>/shell cat rlog.txt</code> to see more information"
            if not error and remote_type == "drive" and self._use_service_accounts:
                error = "Mostly your service accounts don't have access to this drive!"
            elif not error:

@@ -177,7 +179,7 @@ class RcloneTransferHelper:

        if (
            remote_type == "drive"
-            and not config_dict["RCLONE_FLAGS"]
+            and not Config.RCLONE_FLAGS
            and not self._listener.rc_flags
        ):
            cmd.append("--drive-acknowledge-abuse")

@@ -207,6 +209,11 @@ class RcloneTransferHelper:
            "--config",
            config_path,
            epath,
+            "--log-systemd",
+            "--log-file",
+            "rlog.txt",
+            "--log-level",
+            "ERROR",
        ]
        res, err, code = await cmd_exec(cmd)

@@ -239,7 +246,9 @@ class RcloneTransferHelper:
        if return_code == -9:
            return False
        elif return_code != 0:
-            error = (await self._proc.stderr.read()).decode().strip() or "Use <code>/shell cat rlog.txt</code> to see more information"
+            error = (
+                await self._proc.stderr.read()
+            ).decode().strip() or "Use <code>/shell cat rlog.txt</code> to see more information"
            if not error and remote_type == "drive" and self._use_service_accounts:
                error = "Mostly your service accounts don't have access to this drive or RATE_LIMIT_EXCEEDED"
            elif not error:

@@ -325,7 +334,7 @@ class RcloneTransferHelper:
        )
        if (
            remote_type == "drive"
-            and not config_dict["RCLONE_FLAGS"]
+            and not Config.RCLONE_FLAGS
            and not self._listener.rc_flags
        ):
            cmd.extend(("--drive-chunk-size", "128M", "--drive-upload-cutoff", "128M"))

@@ -346,7 +355,18 @@ class RcloneTransferHelper:
        else:
            destination = f"{oremote}:{self._listener.name}"

-        cmd = ["rclone", "link", "--config", oconfig_path, destination]
+        cmd = [
+            "rclone",
+            "link",
+            "--config",
+            oconfig_path,
+            destination,
+            "--log-systemd",
+            "--log-file",
+            "rlog.txt",
+            "--log-level",
+            "ERROR",
+        ]
        res, err, code = await cmd_exec(cmd)

        if code == 0:

@@ -386,7 +406,7 @@ class RcloneTransferHelper:
        cmd = self._get_updated_command(
            config_path, f"{src_remote}:{src_path}", destination, method
        )
-        if not self._listener.rc_flags and not config_dict["RCLONE_FLAGS"]:
+        if not self._listener.rc_flags and not Config.RCLONE_FLAGS:
            if src_remote_type == "drive" and dst_remote_type != "drive":
                cmd.append("--drive-acknowledge-abuse")
            elif src_remote_type == "drive":

@@ -401,7 +421,9 @@ class RcloneTransferHelper:
        if return_code == -9:
            return None, None
        elif return_code != 0:
-            error = (await self._proc.stderr.read()).decode().strip() or "Use <code>/shell cat rlog.txt</code> to see more information"
+            error = (
+                await self._proc.stderr.read()
+            ).decode().strip() or "Use <code>/shell cat rlog.txt</code> to see more information"
            LOGGER.error(error)
            await self._listener.on_upload_error(error[:4000])
            return None, None

@@ -419,7 +441,18 @@ class RcloneTransferHelper:
            f"/{self._listener.name}" if dst_path else self._listener.name
        )

-        cmd = ["rclone", "link", "--config", config_path, destination]
+        cmd = [
+            "rclone",
+            "link",
+            "--config",
+            config_path,
+            destination,
+            "--log-systemd",
+            "--log-file",
+            "rlog.txt",
+            "--log-level",
+            "ERROR",
+        ]
        res, err, code = await cmd_exec(cmd)

        if self._listener.is_cancelled:

@@ -460,16 +493,17 @@ class RcloneTransferHelper:
            "--low-level-retries",
            "1",
            "-M",
+            "--log-systemd",
            "--log-file",
            "rlog.txt",
            "--log-level",
-            "DEBUG",
+            "ERROR",
        ]
        if self.rclone_select:
            cmd.extend(("--files-from", self._listener.link))
        else:
            cmd.extend(("--exclude", ext))
-        if rcflags := self._listener.rc_flags or config_dict["RCLONE_FLAGS"]:
+        if rcflags := self._listener.rc_flags or Config.RCLONE_FLAGS:
            rcflags = rcflags.split("|")
            for flag in rcflags:
                if ":" in flag:
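A recurring change in the rclone hunks above is appending the same logging flags (`--log-systemd`, `--log-file rlog.txt`, `--log-level ERROR`) to every rclone invocation, so failures land in `rlog.txt` instead of DEBUG-level noise. A small sketch of factoring that repetition into a helper (the helper itself is illustrative, not part of the commit):

```python
# Append the commit's standard rclone logging flags to a command list.
# The file name "rlog.txt" matches the one referenced in the bot's
# "/shell cat rlog.txt" error hints.
def with_rclone_logging(cmd):
    return cmd + ["--log-systemd", "--log-file", "rlog.txt", "--log-level", "ERROR"]


cmd = with_rclone_logging(
    ["rclone", "link", "--config", "rclone.conf", "remote:name"]
)
```

Centralizing the flags would avoid the five near-identical list extensions this diff adds by hand.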
@@ -1,6 +1,6 @@
 from time import time

-from bot import aria2, LOGGER
+from .... import aria2, LOGGER
 from ...ext_utils.bot_utils import sync_to_async
 from ...ext_utils.status_utils import MirrorStatus, get_readable_time
@@ -1,4 +1,4 @@
-from bot import LOGGER, subprocess_lock
+from .... import LOGGER, subprocess_lock
 from ...ext_utils.status_utils import get_readable_file_size, MirrorStatus
@@ -1,4 +1,4 @@
-from bot.helper.ext_utils.status_utils import (
+from ....helper.ext_utils.status_utils import (
     MirrorStatus,
     get_readable_file_size,
     get_readable_time,
@@ -1,4 +1,4 @@
-from bot import LOGGER, jd_lock, jd_downloads
+from .... import LOGGER, jd_lock, jd_downloads
 from ...ext_utils.bot_utils import async_to_sync
 from ...ext_utils.jdownloader_booter import jdownloader
 from ...ext_utils.status_utils import (
@@ -1,6 +1,6 @@
 from asyncio import gather

-from bot import LOGGER, sabnzbd_client, nzb_jobs, nzb_listener_lock
+from .... import LOGGER, sabnzbd_client, nzb_jobs, nzb_listener_lock
 from ...ext_utils.bot_utils import async_to_sync
 from ...ext_utils.status_utils import (
     MirrorStatus,
@@ -1,6 +1,6 @@
 from asyncio import sleep, gather

-from bot import LOGGER, qbittorrent_client, qb_torrents, qb_listener_lock
+from .... import LOGGER, qbittorrent_client, qb_torrents, qb_listener_lock
 from ...ext_utils.bot_utils import sync_to_async
 from ...ext_utils.status_utils import (
     MirrorStatus,
@@ -1,4 +1,4 @@
-from bot import LOGGER
+from .... import LOGGER
 from ...ext_utils.status_utils import get_readable_file_size, MirrorStatus
@@ -1,6 +1,6 @@
 from time import time

-from bot import LOGGER, subprocess_lock
+from .... import LOGGER, subprocess_lock
 from ...ext_utils.files_utils import get_path_size
 from ...ext_utils.status_utils import (
     get_readable_file_size,
@@ -6,7 +6,7 @@ from natsort import natsorted
 from os import walk, path as ospath
 from time import time
 from re import match as re_match, sub as re_sub
-from pyrogram.errors import FloodWait, RPCError, FloodPremiumWait
+from pyrogram.errors import FloodWait, RPCError, FloodPremiumWait, BadRequest
 from aiofiles.os import (
     remove,
     path as aiopath,

@@ -26,7 +26,8 @@ from tenacity import (
     RetryError,
 )

-from bot import config_dict, user
+from ...core.config_manager import Config
+from ...core.mltb_client import TgClient
 from ..ext_utils.bot_utils import sync_to_async
 from ..ext_utils.files_utils import clean_unwanted, is_archive, get_base_name
 from ..telegram_helper.message_utils import delete_message

@@ -61,11 +62,12 @@ class TelegramUploader:
        self._is_private = False
        self._sent_msg = None
        self._user_session = self._listener.user_transmission
+        self._error = ""

    async def _upload_progress(self, current, _):
        if self._listener.is_cancelled:
            if self._user_session:
-                user.stop_transmission()
+                TgClient.user.stop_transmission()
            else:
                self._listener.client.stop_transmission()
        chunk_size = current - self._last_uploaded

@@ -74,12 +76,12 @@ class TelegramUploader:

    async def _user_settings(self):
        self._media_group = self._listener.user_dict.get("media_group") or (
-            config_dict["MEDIA_GROUP"]
+            Config.MEDIA_GROUP
            if "media_group" not in self._listener.user_dict
            else False
        )
        self._lprefix = self._listener.user_dict.get("lprefix") or (
-            config_dict["LEECH_FILENAME_PREFIX"]
+            Config.LEECH_FILENAME_PREFIX
            if "lprefix" not in self._listener.user_dict
            else ""
        )

@@ -95,7 +97,7 @@ class TelegramUploader:
        )
        try:
            if self._user_session:
-                self._sent_msg = await user.send_message(
+                self._sent_msg = await TgClient.user.send_message(
                    chat_id=self._listener.up_dest,
                    text=msg,
                    disable_web_page_preview=True,

@@ -115,11 +117,11 @@ class TelegramUploader:
                await self._listener.on_upload_error(str(e))
                return False
        elif self._user_session:
-            self._sent_msg = await user.get_messages(
+            self._sent_msg = await TgClient.user.get_messages(
                chat_id=self._listener.message.chat.id, message_ids=self._listener.mid
            )
            if self._sent_msg is None:
-                self._sent_msg = await user.send_message(
+                self._sent_msg = await TgClient.user.send_message(
                    chat_id=self._listener.message.chat.id,
                    text="Deleted Cmd Message! Don't delete the cmd message again!",
                    disable_web_page_preview=True,

@@ -217,7 +219,7 @@ class TelegramUploader:
                    chat_id=msg[0], message_ids=msg[1]
                )
            else:
-                msgs[index] = await user.get_messages(
+                msgs[index] = await TgClient.user.get_messages(
                    chat_id=msg[0], message_ids=msg[1]
                )
        msgs_list = await msgs[0].reply_to_message.reply_media_group(

@@ -249,6 +251,7 @@ class TelegramUploader:
                continue
            for file_ in natsorted(files):
                delete_file = False
+                self._error = ""
                self._up_path = f_path = ospath.join(dirpath, file_)
                if self._up_path in ft_delete:
                    delete_file = True

@@ -280,10 +283,10 @@ class TelegramUploader:
                        for subkey, msgs in list(value.items()):
                            if len(msgs) > 1:
                                await self._send_media_group(subkey, key, msgs)
-                if self._listener.mixed_leech:
+                if self._listener.mixed_leech and self._listener.user_transmission:
                    self._user_session = f_size > 2097152000
                    if self._user_session:
-                        self._sent_msg = await user.get_messages(
+                        self._sent_msg = await TgClient.user.get_messages(
                            chat_id=self._sent_msg.chat.id,
                            message_ids=self._sent_msg.id,
                        )

@@ -311,6 +314,7 @@ class TelegramUploader:
                    )
                    err = err.last_attempt.exception()
                LOGGER.error(f"{err}. Path: {self._up_path}")
+                self._error = str(err)
                self._corrupted += 1
                if self._listener.is_cancelled:
                    return

@@ -345,9 +349,7 @@ class TelegramUploader:
            )
            return
        if self._total_files <= self._corrupted:
-            await self._listener.on_upload_error(
-                "Files Corrupted or unable to upload. Check logs!"
-            )
+            await self._listener.on_upload_error(f"Files Corrupted or unable to upload. {self._error or 'Check logs!'}")
            return
        LOGGER.info(f"Leech Completed: {self._listener.name}")
        await self._listener.on_upload_complete(

@@ -501,7 +503,7 @@ class TelegramUploader:
            await remove(thumb)
        err_type = "RPCError: " if isinstance(err, RPCError) else ""
        LOGGER.error(f"{err_type}{err}. Path: {self._up_path}")
-        if "Telegram says: [400" in str(err) and key != "documents":
+        if isinstance(err, BadRequest) and key != "documents":
            LOGGER.error(f"Retrying As Document. Path: {self._up_path}")
            return await self._upload_file(cap_mono, file, o_path, True)
        raise err
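The last uploader hunk swaps fragile string matching on `"Telegram says: [400"` for an `isinstance` check against Pyrogram's `BadRequest` exception class. A minimal sketch of that retry decision in isolation, using a stand-in exception class and an illustrative helper name:

```python
class BadRequest(Exception):
    """Stand-in for pyrogram.errors.BadRequest (the 400 error family)."""


def should_retry_as_document(err, key):
    # Typed check: matches any 400-family error regardless of message
    # wording, and never matches unrelated errors whose text happens
    # to contain "[400". Media already sent as a document is not retried.
    return isinstance(err, BadRequest) and key != "documents"
```

Matching on exception type rather than message text also survives library updates that reword error strings.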
@@ -1,46 +1,46 @@
-from bot import CMD_SUFFIX
+from ...core.config_manager import Config


 class _BotCommands:
     def __init__(self):
-        self.StartCommand = f"start{CMD_SUFFIX}"
-        self.MirrorCommand = [f"mirror{CMD_SUFFIX}", f"m{CMD_SUFFIX}"]
-        self.QbMirrorCommand = [f"qbmirror{CMD_SUFFIX}", f"qm{CMD_SUFFIX}"]
-        self.JdMirrorCommand = [f"jdmirror{CMD_SUFFIX}", f"jm{CMD_SUFFIX}"]
-        self.YtdlCommand = [f"ytdl{CMD_SUFFIX}", f"y{CMD_SUFFIX}"]
-        self.NzbMirrorCommand = [f"nzbmirror{CMD_SUFFIX}", f"nm{CMD_SUFFIX}"]
-        self.LeechCommand = [f"leech{CMD_SUFFIX}", f"l{CMD_SUFFIX}"]
-        self.QbLeechCommand = [f"qbleech{CMD_SUFFIX}", f"ql{CMD_SUFFIX}"]
-        self.JdLeechCommand = [f"jdLeech{CMD_SUFFIX}", f"jl{CMD_SUFFIX}"]
-        self.YtdlLeechCommand = [f"ytdlleech{CMD_SUFFIX}", f"yl{CMD_SUFFIX}"]
-        self.NzbLeechCommand = [f"nzbleech{CMD_SUFFIX}", f"nl{CMD_SUFFIX}"]
-        self.CloneCommand = f"clone{CMD_SUFFIX}"
-        self.CountCommand = f"count{CMD_SUFFIX}"
-        self.DeleteCommand = f"del{CMD_SUFFIX}"
-        self.CancelTaskCommand = [f"cancel{CMD_SUFFIX}", f"c{CMD_SUFFIX}"]
-        self.CancelAllCommand = f"cancelall{CMD_SUFFIX}"
-        self.ForceStartCommand = [f"forcestart{CMD_SUFFIX}", f"fs{CMD_SUFFIX}"]
-        self.ListCommand = f"list{CMD_SUFFIX}"
-        self.SearchCommand = f"search{CMD_SUFFIX}"
-        self.StatusCommand = f"status{CMD_SUFFIX}"
-        self.UsersCommand = f"users{CMD_SUFFIX}"
-        self.AuthorizeCommand = f"authorize{CMD_SUFFIX}"
-        self.UnAuthorizeCommand = f"unauthorize{CMD_SUFFIX}"
-        self.AddSudoCommand = f"addsudo{CMD_SUFFIX}"
-        self.RmSudoCommand = f"rmsudo{CMD_SUFFIX}"
-        self.PingCommand = f"ping{CMD_SUFFIX}"
-        self.RestartCommand = f"restart{CMD_SUFFIX}"
-        self.StatsCommand = f"stats{CMD_SUFFIX}"
-        self.HelpCommand = f"help{CMD_SUFFIX}"
-        self.LogCommand = f"log{CMD_SUFFIX}"
-        self.ShellCommand = f"shell{CMD_SUFFIX}"
-        self.AExecCommand = f"aexec{CMD_SUFFIX}"
-        self.ExecCommand = f"exec{CMD_SUFFIX}"
-        self.ClearLocalsCommand = f"clearlocals{CMD_SUFFIX}"
-        self.BotSetCommand = [f"bsetting{CMD_SUFFIX}", f"bs{CMD_SUFFIX}"]
-        self.UserSetCommand = [f"usetting{CMD_SUFFIX}", f"us{CMD_SUFFIX}"]
-        self.SelectCommand = f"sel{CMD_SUFFIX}"
-        self.RssCommand = f"rss{CMD_SUFFIX}"
+        self.StartCommand = f"start{Config.CMD_SUFFIX}"
+        self.MirrorCommand = [f"mirror{Config.CMD_SUFFIX}", f"m{Config.CMD_SUFFIX}"]
+        self.QbMirrorCommand = [f"qbmirror{Config.CMD_SUFFIX}", f"qm{Config.CMD_SUFFIX}"]
+        self.JdMirrorCommand = [f"jdmirror{Config.CMD_SUFFIX}", f"jm{Config.CMD_SUFFIX}"]
+        self.YtdlCommand = [f"ytdl{Config.CMD_SUFFIX}", f"y{Config.CMD_SUFFIX}"]
+        self.NzbMirrorCommand = [f"nzbmirror{Config.CMD_SUFFIX}", f"nm{Config.CMD_SUFFIX}"]
+        self.LeechCommand = [f"leech{Config.CMD_SUFFIX}", f"l{Config.CMD_SUFFIX}"]
+        self.QbLeechCommand = [f"qbleech{Config.CMD_SUFFIX}", f"ql{Config.CMD_SUFFIX}"]
+        self.JdLeechCommand = [f"jdLeech{Config.CMD_SUFFIX}", f"jl{Config.CMD_SUFFIX}"]
+        self.YtdlLeechCommand = [f"ytdlleech{Config.CMD_SUFFIX}", f"yl{Config.CMD_SUFFIX}"]
+        self.NzbLeechCommand = [f"nzbleech{Config.CMD_SUFFIX}", f"nl{Config.CMD_SUFFIX}"]
+        self.CloneCommand = f"clone{Config.CMD_SUFFIX}"
+        self.CountCommand = f"count{Config.CMD_SUFFIX}"
+        self.DeleteCommand = f"del{Config.CMD_SUFFIX}"
+        self.CancelTaskCommand = [f"cancel{Config.CMD_SUFFIX}", f"c{Config.CMD_SUFFIX}"]
+        self.CancelAllCommand = f"cancelall{Config.CMD_SUFFIX}"
+        self.ForceStartCommand = [f"forcestart{Config.CMD_SUFFIX}", f"fs{Config.CMD_SUFFIX}"]
+        self.ListCommand = f"list{Config.CMD_SUFFIX}"
+        self.SearchCommand = f"search{Config.CMD_SUFFIX}"
+        self.StatusCommand = f"status{Config.CMD_SUFFIX}"
+        self.UsersCommand = f"users{Config.CMD_SUFFIX}"
+        self.AuthorizeCommand = f"authorize{Config.CMD_SUFFIX}"
+        self.UnAuthorizeCommand = f"unauthorize{Config.CMD_SUFFIX}"
+        self.AddSudoCommand = f"addsudo{Config.CMD_SUFFIX}"
+        self.RmSudoCommand = f"rmsudo{Config.CMD_SUFFIX}"
+        self.PingCommand = f"ping{Config.CMD_SUFFIX}"
+        self.RestartCommand = f"restart{Config.CMD_SUFFIX}"
+        self.StatsCommand = f"stats{Config.CMD_SUFFIX}"
+        self.HelpCommand = f"help{Config.CMD_SUFFIX}"
+        self.LogCommand = f"log{Config.CMD_SUFFIX}"
+        self.ShellCommand = f"shell{Config.CMD_SUFFIX}"
+        self.AExecCommand = f"aexec{Config.CMD_SUFFIX}"
+        self.ExecCommand = f"exec{Config.CMD_SUFFIX}"
+        self.ClearLocalsCommand = f"clearlocals{Config.CMD_SUFFIX}"
+        self.BotSetCommand = [f"bsetting{Config.CMD_SUFFIX}", f"bs{Config.CMD_SUFFIX}"]
+        self.UserSetCommand = [f"usetting{Config.CMD_SUFFIX}", f"us{Config.CMD_SUFFIX}"]
+        self.SelectCommand = f"sel{Config.CMD_SUFFIX}"
+        self.RssCommand = f"rss{Config.CMD_SUFFIX}"


 BotCommands = _BotCommands()
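The command-suffix mechanism above lets several bot instances share a chat without command clashes: every command name is built by appending one configured suffix. A small sketch of the pattern with an illustrative suffix value:

```python
class Config:
    CMD_SUFFIX = "2"  # illustrative; set per deployment, may be ""


def make_commands(suffix):
    # Each entry mirrors how _BotCommands builds names: the suffix is
    # appended to both the long form and the short alias.
    return {
        "start": f"start{suffix}",
        "mirror": [f"mirror{suffix}", f"m{suffix}"],
    }


cmds = make_commands(Config.CMD_SUFFIX)
```

With `CMD_SUFFIX = "2"`, users invoke `/mirror2` or `/m2`, so a second bot with suffix `"3"` can live in the same group.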
@@ -1,13 +1,14 @@
 from pyrogram.filters import create

-from bot import user_data, OWNER_ID
+from ... import user_data
+from ...core.config_manager import Config


 class CustomFilters:
     async def owner_filter(self, _, update):
         user = update.from_user or update.sender_chat
         uid = user.id
-        return uid == OWNER_ID
+        return uid == Config.OWNER_ID

     owner = create(owner_filter)

@@ -17,7 +18,7 @@ class CustomFilters:
        chat_id = update.chat.id
        thread_id = update.message_thread_id if update.is_topic_message else None
        return bool(
-            uid == OWNER_ID
+            uid == Config.OWNER_ID
            or (
                uid in user_data
                and (

@@ -41,7 +42,7 @@ class CustomFilters:
        user = update.from_user or update.sender_chat
        uid = user.id
        return bool(
-            uid == OWNER_ID or uid in user_data and user_data[uid].get("is_sudo")
+            uid == Config.OWNER_ID or uid in user_data and user_data[uid].get("is_sudo")
        )

    sudo = create(sudo_user)
@ -3,13 +3,15 @@ from pyrogram.errors import FloodWait, FloodPremiumWait
|
||||
from re import match as re_match
|
||||
from time import time
|
||||
|
||||
from bot import config_dict, LOGGER, status_dict, task_dict_lock, intervals, bot, user
|
||||
from ... import LOGGER, status_dict, task_dict_lock, intervals
|
||||
from ...core.config_manager import Config
|
||||
from ...core.mltb_client import TgClient
|
||||
from ..ext_utils.bot_utils import SetInterval
|
||||
from ..ext_utils.exceptions import TgLinkException
|
||||
from ..ext_utils.status_utils import get_readable_message
|
||||
|
||||
|
||||
async def send_message(message, text, buttons=None, block=True):
|
||||
async def send_message(message, text, buttons=None):
|
||||
try:
|
||||
return await message.reply(
|
||||
text=text,
|
||||
@ -20,16 +22,14 @@ async def send_message(message, text, buttons=None, block=True):
|
||||
)
|
||||
except FloodWait as f:
|
||||
LOGGER.warning(str(f))
|
||||
if block:
|
||||
await sleep(f.value * 1.2)
|
||||
return await send_message(message, text, buttons)
|
||||
return str(f)
|
||||
await sleep(f.value * 1.2)
|
||||
return await send_message(message, text, buttons)
|
||||
except Exception as e:
|
||||
LOGGER.error(str(e))
|
||||
return str(e)
|
||||
|
||||
|
||||
async def edit_message(message, text, buttons=None, block=True):
|
||||
async def edit_message(message, text, buttons=None):
|
||||
try:
|
||||
return await message.edit(
|
||||
text=text,
|
||||
@ -38,10 +38,8 @@ async def edit_message(message, text, buttons=None, block=True):
|
||||
)
|
||||
except FloodWait as f:
|
||||
LOGGER.warning(str(f))
|
||||
if block:
|
||||
await sleep(f.value * 1.2)
|
||||
return await edit_message(message, text, buttons)
|
||||
return str(f)
|
||||
await sleep(f.value * 1.2)
|
||||
return await edit_message(message, text, buttons)
|
||||
except Exception as e:
|
||||
LOGGER.error(str(e))
|
||||
return str(e)
|
||||
@@ -63,7 +61,7 @@ async def send_file(message, file, caption=""):


async def send_rss(text, chat_id, thread_id):
    try:
        app = user or bot
        app = TgClient.user or TgClient.bot
        return await app.send_message(
            chat_id=chat_id,
            text=text,
@@ -118,7 +116,7 @@ async def get_tg_link_message(link):
        msg = re_match(
            r"tg:\/\/openmessage\?user_id=([0-9]+)&message_id=([0-9-]+)", link
        )
        if not user:
        if not TgClient.user:
            raise TgLinkException("USER_SESSION_STRING required for this private link!")

        chat = msg[1]
@@ -148,19 +146,21 @@ async def get_tg_link_message(link):

    if not private:
        try:
            message = await bot.get_messages(chat_id=chat, message_ids=msg_id)
            message = await TgClient.bot.get_messages(chat_id=chat, message_ids=msg_id)
            if message.empty:
                private = True
        except Exception as e:
            private = True
            if not user:
            if not TgClient.user:
                raise e

    if not private:
        return (links, "bot") if links else (message, "bot")
    elif user:
    elif TgClient.user:
        try:
            user_message = await user.get_messages(chat_id=chat, message_ids=msg_id)
            user_message = await TgClient.user.get_messages(
                chat_id=chat, message_ids=msg_id
            )
        except Exception as e:
            raise TgLinkException(
                f"You don't have access to this chat!. ERROR: {e}"
@@ -169,6 +169,17 @@ async def get_tg_link_message(link):
        return (links, "user") if links else (user_message, "user")
    else:
        raise TgLinkException("Private: Please report!")


async def check_permission(client, chat, uploader_id, up_dest):
    member = await chat.get_member(uploader_id)
    if (
        not member.privileges.can_manage_chat
        or not member.privileges.can_delete_messages
    ):
        raise ValueError(
            "You don't have enough privileges in this chat!"
        )


async def update_status_message(sid, force=False):
@@ -196,21 +207,22 @@ async def update_status_message(sid, force=False):
                obj.cancel()
                del intervals["status"][sid]
            return
        if text != status_dict[sid]["message"].text:
            message = await edit_message(
                status_dict[sid]["message"], text, buttons, block=False
            )
            if isinstance(message, str):
                if message.startswith("Telegram says: [400"):
        old_message = status_dict[sid]["message"]
        if text != old_message.text:
            message = await edit_message(old_message, text, buttons)
            if isinstance(message, str):
                if message.startswith("Telegram says: [40"):
                    async with task_dict_lock:
                        del status_dict[sid]
                        if obj := intervals["status"].get(sid):
                            obj.cancel()
                            del intervals["status"][sid]
                else:
                    LOGGER.error(
                        f"Status with id: {sid} haven't been updated. Error: {message}"
                    )
                return
                else:
                    LOGGER.error(
                        f"Status with id: {sid} haven't been updated. Error: {message}"
                    )
                return
        async with task_dict_lock:
            status_dict[sid]["message"].text = text
            status_dict[sid]["time"] = time()

@@ -218,10 +230,10 @@ async def update_status_message(sid, force=False):
async def send_status_message(msg, user_id=0):
    if intervals["stopAll"]:
        return
    sid = user_id or msg.chat.id
    is_user = bool(user_id)
    async with task_dict_lock:
        sid = user_id or msg.chat.id
        is_user = bool(user_id)
        if sid in list(status_dict.keys()):
        if sid in status_dict:
            page_no = status_dict[sid]["page_no"]
            status = status_dict[sid]["status"]
            page_step = status_dict[sid]["page_step"]
@@ -236,7 +248,7 @@ async def send_status_message(msg, user_id=0):
                return
            message = status_dict[sid]["message"]
            await delete_message(message)
            message = await send_message(msg, text, buttons, block=False)
            message = await send_message(msg, text, buttons)
            if isinstance(message, str):
                LOGGER.error(
                    f"Status with id: {sid} haven't been sent. Error: {message}"
@@ -248,7 +260,7 @@ async def send_status_message(msg, user_id=0):
            text, buttons = await get_readable_message(sid, is_user)
            if text is None:
                return
            message = await send_message(msg, text, buttons, block=False)
            message = await send_message(msg, text, buttons)
            if isinstance(message, str):
                LOGGER.error(
                    f"Status with id: {sid} haven't been sent. Error: {message}"
@@ -263,7 +275,7 @@ async def send_status_message(msg, user_id=0):
                "status": "All",
                "is_user": is_user,
            }
    if not intervals["status"].get(sid) and not is_user:
        intervals["status"][sid] = SetInterval(
            config_dict["STATUS_UPDATE_INTERVAL"], update_status_message, sid
        )
    if not intervals["status"].get(sid) and not is_user:
        intervals["status"][sid] = SetInterval(
            Config.STATUS_UPDATE_INTERVAL, update_status_message, sid
        )
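`SetInterval` (imported from `helper.ext_utils.bot_utils`) repeatedly invokes `update_status_message` every `STATUS_UPDATE_INTERVAL` seconds until its `cancel()` is called. A minimal asyncio sketch of such a helper, with the constructor and method names assumed from the call sites above rather than copied from the project:

```python
import asyncio


class SetInterval:
    """Run `await func(*args)` every `interval` seconds until cancel()."""

    def __init__(self, interval, func, *args):
        self.interval = interval
        self.func = func
        self.args = args
        self.task = asyncio.create_task(self._loop())

    async def _loop(self):
        while True:
            await asyncio.sleep(self.interval)
            await self.func(*self.args)

    def cancel(self):
        self.task.cancel()


async def demo():
    ticks = []

    async def update(sid):
        ticks.append(sid)  # stands in for update_status_message(sid)

    timer = SetInterval(0.01, update, "chat-1")
    await asyncio.sleep(0.05)
    timer.cancel()
    return ticks


print(asyncio.run(demo()))
```

Creating the instance schedules the loop immediately, which is why the diff guards with `if not intervals["status"].get(sid)` before constructing another timer for the same chat.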
@@ -1 +1,84 @@
from .bot_settings import send_bot_settings, edit_bot_settings
from .cancel_task import cancel, cancel_multi, cancel_all_buttons, cancel_all_update
from .chat_permission import authorize, unauthorize, add_sudo, remove_sudo
from .clone import clone_node
from .exec import aioexecute, execute, clear
from .file_selector import select, confirm_selection
from .force_start import remove_from_queue
from .gd_count import count_node
from .gd_delete import delete_file
from .gd_search import gdrive_search, select_type
from .help import arg_usage, bot_help
from .mirror_leech import (
    mirror,
    leech,
    qb_leech,
    qb_mirror,
    jd_leech,
    jd_mirror,
    nzb_leech,
    nzb_mirror,
)
from .restart import restart_bot, restart_notification
from .rss import get_rss_menu, rss_listener
from .search import torrent_search, torrent_search_update, initiate_search_tools
from .services import start, ping, log
from .shell import run_shell
from .stats import bot_stats, get_packages_version
from .status import task_status, status_pages
from .users_settings import get_users_settings, edit_user_settings, send_user_settings
from .ytdlp import ytdl, ytdl_leech

__all__ = [
    "send_bot_settings",
    "edit_bot_settings",
    "cancel",
    "cancel_multi",
    "cancel_all_buttons",
    "cancel_all_update",
    "authorize",
    "unauthorize",
    "add_sudo",
    "remove_sudo",
    "clone_node",
    "aioexecute",
    "execute",
    "clear",
    "select",
    "confirm_selection",
    "remove_from_queue",
    "count_node",
    "delete_file",
    "gdrive_search",
    "select_type",
    "arg_usage",
    "mirror",
    "leech",
    "qb_leech",
    "qb_mirror",
    "jd_leech",
    "jd_mirror",
    "nzb_leech",
    "nzb_mirror",
    "restart_bot",
    "restart_notification",
    "get_rss_menu",
    "rss_listener",
    "torrent_search",
    "torrent_search_update",
    "initiate_search_tools",
    "start",
    "bot_help",
    "ping",
    "log",
    "run_shell",
    "bot_stats",
    "get_packages_version",
    "task_status",
    "status_pages",
    "get_users_settings",
    "edit_user_settings",
    "send_user_settings",
    "ytdl",
    "ytdl_leech",
]
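The new package `__init__` re-exports every handler and mirrors the imports in `__all__`, so `from bot.modules import *` exposes exactly these names and nothing else. The mechanics in a self-contained sketch, writing a throwaway module to a temporary directory (all names here are invented for the demo):

```python
import importlib
import os
import sys
import tempfile

# A tiny module whose __all__ exposes only `public`.
src = (
    "def public():\n"
    "    return 'ok'\n"
    "\n"
    "def _private():\n"
    "    return 'hidden'\n"
    "\n"
    "__all__ = ['public']\n"
)
tmpdir = tempfile.mkdtemp()
with open(os.path.join(tmpdir, "mod_demo.py"), "w") as f:
    f.write(src)
sys.path.insert(0, tmpdir)
mod = importlib.import_module("mod_demo")

# Star-import pulls in only the names listed in __all__.
ns = {}
exec("from mod_demo import *", ns)
print("public" in ns, "_private" in ns)  # True False
```

Without `__all__`, a star-import would fall back to every public (non-underscore) top-level name, which is why a curated list like the one above keeps the package surface explicit.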
@@ -7,50 +7,42 @@ from asyncio import (
    sleep,
    gather,
)
from dotenv import load_dotenv
from functools import partial
from io import BytesIO
from os import environ, getcwd
from pyrogram.filters import command, regex, create
from pyrogram.handlers import MessageHandler, CallbackQueryHandler
from os import getcwd
from pyrogram.filters import create
from pyrogram.handlers import MessageHandler
from time import time

from bot import (
    MAX_SPLIT_SIZE,
    IS_PREMIUM_USER,
from .. import (
    LOGGER,
    config_dict,
    user_data,
    drives_ids,
    drives_names,
    index_urls,
    aria2,
    global_extension_filter,
    intervals,
    aria2_options,
    aria2c_global,
    task_dict,
    qbit_options,
    qbittorrent_client,
    sabnzbd_client,
    bot,
    nzb_options,
    get_nzb_options,
    get_qb_options,
    jd_lock,
    extension_filter,
)
from ..helper.ext_utils.bot_utils import (
    SetInterval,
    sync_to_async,
    new_task,
)
from ..core.config_manager import Config
from ..core.mltb_client import TgClient
from ..core.startup import update_qb_options, update_nzb_options, update_variables
from ..helper.ext_utils.db_handler import database
from ..helper.ext_utils.jdownloader_booter import jdownloader
from ..helper.ext_utils.task_manager import start_from_queued
from ..helper.mirror_leech_utils.rclone_utils.serve import rclone_serve_booter
from ..helper.telegram_helper.bot_commands import BotCommands
from ..helper.telegram_helper.button_build import ButtonMaker
from ..helper.telegram_helper.filters import CustomFilters
from ..helper.telegram_helper.message_utils import (
    send_message,
    send_file,
@@ -59,19 +51,19 @@ from ..helper.telegram_helper.message_utils import (
    delete_message,
)
from .rss import add_job
from .torrent_search import initiate_search_tools
from .search import initiate_search_tools

start = 0
state = "view"
handler_dict = {}
DEFAULT_VALUES = {
    "DOWNLOAD_DIR": "/usr/src/app/downloads/",
    "LEECH_SPLIT_SIZE": MAX_SPLIT_SIZE,
    "LEECH_SPLIT_SIZE": TgClient.MAX_SPLIT_SIZE,
    "RSS_DELAY": 600,
    "STATUS_UPDATE_INTERVAL": 15,
    "SEARCH_LIMIT": 0,
    "UPSTREAM_BRANCH": "master",
    "DEFAULT_UPLOAD": "gd",
    "DEFAULT_UPLOAD": "rc",
}
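The refactor's central move is replacing `config_dict[...]` lookups with a `Config` class exposing `get`, `set`, `get_all`, and `load` classmethods. A minimal sketch of how such a class-based config, seeded with defaults like `DEFAULT_VALUES` above, might behave; this is an assumed shape for illustration, not the project's actual `config_manager`:

```python
class Config:
    # Defaults mirroring a few entries of DEFAULT_VALUES above.
    DOWNLOAD_DIR = "/usr/src/app/downloads/"
    RSS_DELAY = 600
    STATUS_UPDATE_INTERVAL = 15
    DEFAULT_UPLOAD = "rc"

    @classmethod
    def get(cls, key):
        return getattr(cls, key)

    @classmethod
    def set(cls, key, value):
        setattr(cls, key, value)

    @classmethod
    def get_all(cls):
        # Treat only uppercase class attributes as config keys.
        return {k: v for k, v in vars(cls).items() if k.isupper()}


Config.set("STATUS_UPDATE_INTERVAL", 10)
print(Config.get("STATUS_UPDATE_INTERVAL"), Config.get_all()["DEFAULT_UPLOAD"])  # 10 rc
```

Compared with a plain dict, class attributes give attribute access (`Config.BASE_URL`), a single importable source of truth, and a natural place for a `load()` that re-reads `config.py`.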


@@ -94,18 +86,18 @@ async def get_buttons(key=None, edit_type=None):
            buttons.data_button("Default", f"botset resetvar {key}")
        buttons.data_button("Close", "botset close")
        if key in [
            "SUDO_USERS",
            "CMD_SUFFIX",
            "OWNER_ID",
            "USER_SESSION_STRING",
            "TELEGRAM_HASH",
            "TELEGRAM_API",
            "AUTHORIZED_CHATS",
            "BOT_TOKEN",
            "DOWNLOAD_DIR",
            "SUDO_USERS",
            "AUTHORIZED_CHATS",
        ]:
            msg += "Restart required for this edit to take effect!\n\n"
        msg += f"Send a valid value for {key}. Current value is '{config_dict[key]}'. Timeout: 60 sec"
            msg += "Restart required for this edit to take effect! You will not see the changes in bot vars, the edit will be in database only!\n\n"
        msg += f"Send a valid value for {key}. Current value is '{Config.get(key)}'. Timeout: 60 sec"
    elif edit_type == "ariavar":
        buttons.data_button("Back", "botset aria")
        if key != "newkey":
@@ -135,11 +127,30 @@ async def get_buttons(key=None, edit_type=None):
            buttons.data_button("Empty", f"botset emptyserkey {index} {key}")
        buttons.data_button("Close", "botset close")
        if key == "newser":
            msg = "Send one server as dictionary {}, like in config.env without []. Timeout: 60 sec"
            msg = "Send one server as dictionary {}, like in config.py without []. Timeout: 60 sec"
        else:
            msg = f"Send a valid value for {key} in server {config_dict['USENET_SERVERS'][index]['name']}. Current value is '{config_dict['USENET_SERVERS'][index][key]}. Timeout: 60 sec"
            msg = f"Send a valid value for {key} in server {Config.USENET_SERVERS[index]['name']}. Current value is {Config.USENET_SERVERS[index][key]}. Timeout: 60 sec"
    elif key == "var":
        for k in list(config_dict.keys())[start : 10 + start]:
        conf_dict = Config.get_all()
        for k in list(conf_dict.keys())[start : 10 + start]:
            if (
                key
                in [
                    "CMD_SUFFIX",
                    "OWNER_ID",
                    "USER_SESSION_STRING",
                    "TELEGRAM_HASH",
                    "TELEGRAM_API",
                    "BOT_TOKEN",
                    "DOWNLOAD_DIR",
                    "SUDO_USERS",
                    "AUTHORIZED_CHATS",
                ]
                and not Config.DATABASE_URL
            ):
                continue
            if k == "DATABASE_URL" and state != "view":
                continue
            buttons.data_button(k, f"botset botvar {k}")
        if state == "view":
            buttons.data_button("Edit", "botset edit var")
@@ -147,7 +158,7 @@ async def get_buttons(key=None, edit_type=None):
            buttons.data_button("View", "botset view var")
        buttons.data_button("Back", "botset back")
        buttons.data_button("Close", "botset close")
        for x in range(0, len(config_dict), 10):
        for x in range(0, len(conf_dict), 10):
            buttons.data_button(
                f"{int(x / 10)}", f"botset start var {x}", position="footer"
            )
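The settings menus above page through entries ten at a time: `start` is the current offset into the key list, and each footer button labelled `int(x / 10)` jumps to offset `x`. The slicing arithmetic, isolated in a sketch (function and variable names invented for the demo):

```python
def page(items, start, step=10):
    """Return the visible slice plus one footer label per page."""
    visible = items[start : start + step]
    footer = [str(x // step) for x in range(0, len(items), step)]
    return visible, footer


items = [f"KEY_{i}" for i in range(23)]
visible, footer = page(items, 10)
print(visible[0], len(visible), footer)  # KEY_10 10 ['0', '1', '2']
```

With 23 entries this yields three pages (offsets 0, 10, 20), matching the menus' `if len(...) > 10` guard that only draws footer buttons when a second page exists.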
@@ -155,7 +166,7 @@ async def get_buttons(key=None, edit_type=None):
    elif key == "private":
        buttons.data_button("Back", "botset back")
        buttons.data_button("Close", "botset close")
        msg = """Send private file: config.env, token.pickle, rclone.conf, accounts.zip, list_drives.txt, cookies.txt, .netrc or any other private file!
        msg = """Send private file: config.py, token.pickle, rclone.conf, accounts.zip, list_drives.txt, cookies.txt, .netrc or any other private file!
To delete private file send only the file name as text message.
Note: Changing .netrc will not take effect for aria2c until restart.
Timeout: 60 sec"""
@@ -206,23 +217,21 @@ Timeout: 60 sec"""
            )
        msg = f"Sabnzbd Options | Page: {int(start / 10)} | State: {state}"
    elif key == "nzbserver":
        if len(config_dict["USENET_SERVERS"]) > 0:
            for index, k in enumerate(
                config_dict["USENET_SERVERS"][start : 10 + start]
            ):
        if len(Config.USENET_SERVERS) > 0:
            for index, k in enumerate(Config.USENET_SERVERS[start : 10 + start]):
                buttons.data_button(k["name"], f"botset nzbser{index}")
        buttons.data_button("Add New", "botset nzbsevar newser")
        buttons.data_button("Back", "botset nzb")
        buttons.data_button("Close", "botset close")
        if len(config_dict["USENET_SERVERS"]) > 10:
            for x in range(0, len(config_dict["USENET_SERVERS"]), 10):
        if len(Config.USENET_SERVERS) > 10:
            for x in range(0, len(Config.USENET_SERVERS), 10):
                buttons.data_button(
                    f"{int(x / 10)}", f"botset start nzbser {x}", position="footer"
                )
        msg = f"Usenet Servers | Page: {int(start / 10)} | State: {state}"
    elif key.startswith("nzbser"):
        index = int(key.replace("nzbser", ""))
        for k in list(config_dict["USENET_SERVERS"][index].keys())[start : 10 + start]:
        for k in list(Config.USENET_SERVERS[index].keys())[start : 10 + start]:
            buttons.data_button(k, f"botset nzbsevar{index} {k}")
        if state == "view":
            buttons.data_button("Edit", f"botset edit {key}")
@@ -231,8 +240,8 @@ Timeout: 60 sec"""
        buttons.data_button("Remove Server", f"botset remser {index}")
        buttons.data_button("Back", "botset nzbserver")
        buttons.data_button("Close", "botset close")
        if len(config_dict["USENET_SERVERS"][index].keys()) > 10:
            for x in range(0, len(config_dict["USENET_SERVERS"][index]), 10):
        if len(Config.USENET_SERVERS[index].keys()) > 10:
            for x in range(0, len(Config.USENET_SERVERS[index]), 10):
                buttons.data_button(
                    f"{int(x / 10)}", f"botset start {key} {x}", position="footer"
                )
@@ -255,7 +264,7 @@ async def edit_variable(_, message, pre_message, key):
        value = True
    elif value.lower() == "false":
        value = False
        if key == "INCOMPLETE_TASK_NOTIFIER" and config_dict["DATABASE_URL"]:
        if key == "INCOMPLETE_TASK_NOTIFIER" and Config.DATABASE_URL:
            await database.trunc_table("tasks")
    elif key == "DOWNLOAD_DIR":
        if not value.endswith("/"):
@@ -283,21 +292,21 @@ async def edit_variable(_, message, pre_message, key):
                    LOGGER.error(e)
        aria2_options["bt-stop-timeout"] = f"{value}"
    elif key == "LEECH_SPLIT_SIZE":
        value = min(int(value), MAX_SPLIT_SIZE)
        value = min(int(value), TgClient.MAX_SPLIT_SIZE)
    elif key == "BASE_URL_PORT":
        value = int(value)
        if config_dict["BASE_URL"]:
        if Config.BASE_URL:
            await (await create_subprocess_exec("pkill", "-9", "-f", "gunicorn")).wait()
            await create_subprocess_shell(
                f"gunicorn web.wserver:app --bind 0.0.0.0:{value} --worker-class gevent"
            )
    elif key == "EXTENSION_FILTER":
        fx = value.split()
        global_extension_filter.clear()
        global_extension_filter.extend(["aria2", "!qB"])
        extension_filter.clear()
        extension_filter.extend(["aria2", "!qB"])
        for x in fx:
            x = x.lstrip(".")
            global_extension_filter.append(x.strip().lower())
            extension_filter.append(x.strip().lower())
    elif key == "GDRIVE_ID":
        if drives_names and drives_names[0] == "Main":
            drives_ids[0] = value
@@ -312,13 +321,21 @@ async def edit_variable(_, message, pre_message, key):
        value = int(value)
    elif value.startswith("[") and value.endswith("]"):
        value = eval(value)
    config_dict[key] = value
    if key not in [
        "CMD_SUFFIX",
        "OWNER_ID",
        "USER_SESSION_STRING",
        "TELEGRAM_HASH",
        "TELEGRAM_API",
        "BOT_TOKEN",
        "DOWNLOAD_DIR",
        "SUDO_USERS",
        "AUTHORIZED_CHATS",
    ]:
        Config.set(key, value)
    await update_buttons(pre_message, "var")
    await delete_message(message)
    if key == "DATABASE_URL":
        await database.connect()
    if config_dict["DATABASE_URL"]:
        await database.update_config({key: value})
    await database.update_config({key: value})
    if key in ["SEARCH_PLUGINS", "SEARCH_API_LINK"]:
        await initiate_search_tools()
    elif key in ["QUEUE_ALL", "QUEUE_DOWNLOAD", "QUEUE_UPLOAD"]:
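`edit_variable` coerces the incoming text before storing it: `'true'`/`'false'` become booleans, digit strings become `int`, and bracketed strings are parsed as lists via `eval`. A hedged sketch of the same coercion using `ast.literal_eval`, which accepts literal lists without executing arbitrary code; this is a suggested substitution for illustration, not what the diff itself does:

```python
from ast import literal_eval


def coerce(value: str):
    """Turn a user-supplied string into bool, int, list, or leave it as str."""
    low = value.lower()
    if low == "true":
        return True
    if low == "false":
        return False
    if value.isdigit():
        return int(value)
    if value.startswith("[") and value.endswith("]"):
        return literal_eval(value)  # safe for literals, unlike eval()
    return value


print(coerce("true"), coerce("42"), coerce("[1, 2]"), coerce("gd"))
# True 42 [1, 2] gd
```

`literal_eval` raises `ValueError` on anything that is not a Python literal, so a malicious payload such as `[__import__('os').system('...')]` fails instead of running, which is the usual motivation for preferring it over `eval` on user input.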
@@ -349,22 +366,20 @@ async def edit_aria(_, message, pre_message, key):
        value = "true"
    elif value.lower() == "false":
        value = "false"
    if key in aria2c_global:
        await sync_to_async(aria2.set_global_options, {key: value})
    else:
        downloads = await sync_to_async(aria2.get_downloads)
        for download in downloads:
            if not download.is_complete:
                try:
                    await sync_to_async(
                        aria2.client.change_option, download.gid, {key: value}
                    )
                except Exception as e:
                    LOGGER.error(e)
    aria2_options[key] = value
    downloads = await sync_to_async(aria2.get_downloads)
    for download in downloads:
        if not download.is_complete:
            try:
                await sync_to_async(
                    aria2.client.change_option, download.gid, {key: value}
                )
            except Exception as e:
                LOGGER.error(e)
    await update_buttons(pre_message, "aria")
    await delete_message(message)
    if config_dict["DATABASE_URL"]:
    if key not in ["checksum", "index-out", "out", "pause", "select-file"]:
        await sync_to_async(aria2.set_global_options, {key: value})
    aria2_options[key] = value
    await database.update_aria2(key, value)


@@ -384,8 +399,7 @@ async def edit_qbit(_, message, pre_message, key):
    qbit_options[key] = value
    await update_buttons(pre_message, "qbit")
    await delete_message(message)
    if config_dict["DATABASE_URL"]:
        await database.update_qbittorrent(key, value)
    await database.update_qbittorrent(key, value)


@new_task
@@ -400,8 +414,7 @@ async def edit_nzb(_, message, pre_message, key):
    nzb_options[key] = res["config"]["misc"][key]
    await update_buttons(pre_message, "nzb")
    await delete_message(message)
    if config_dict["DATABASE_URL"]:
        await database.update_nzb_config()
    await database.update_nzb_config()


@new_task
@@ -421,27 +434,26 @@ async def edit_nzb_server(_, message, pre_message, key, index=0):
            await send_message(message, "Invalid server!")
            await update_buttons(pre_message, "nzbserver")
            return
        config_dict["USENET_SERVERS"].append(value)
        Config.USENET_SERVERS.append(value)
        await update_buttons(pre_message, "nzbserver")
    elif key != "newser":
        if value.isdigit():
            value = int(value)
        res = await sabnzbd_client.add_server(
            {"name": config_dict["USENET_SERVERS"][index]["name"], key: value}
            {"name": Config.USENET_SERVERS[index]["name"], key: value}
        )
        if res["config"]["servers"][0][key] == "":
            await send_message(message, "Invalid value")
            return
        config_dict["USENET_SERVERS"][index][key] = value
        Config.USENET_SERVERS[index][key] = value
        await update_buttons(pre_message, f"nzbser{index}")
    await delete_message(message)
    if config_dict["DATABASE_URL"]:
        await database.update_config({"USENET_SERVERS": config_dict["USENET_SERVERS"]})
    await database.update_config({"USENET_SERVERS": Config.USENET_SERVERS})


async def sync_jdownloader():
    async with jd_lock:
        if not config_dict["DATABASE_URL"] or not jdownloader.is_connected:
        if not Config.DATABASE_URL or not jdownloader.is_connected:
            return
        await jdownloader.device.system.exit_jd()
    if await aiopath.exists("cfg.zip"):
@@ -457,16 +469,15 @@ async def update_private_file(_, message, pre_message):
    handler_dict[message.chat.id] = False
    if not message.media and (file_name := message.text):
        fn = file_name.rsplit(".zip", 1)[0]
        if await aiopath.isfile(fn) and file_name != "config.env":
        if await aiopath.isfile(fn) and file_name != "config.py":
            await remove(fn)
        if fn == "accounts":
            if await aiopath.exists("accounts"):
                await rmtree("accounts", ignore_errors=True)
            if await aiopath.exists("rclone_sa"):
                await rmtree("rclone_sa", ignore_errors=True)
            config_dict["USE_SERVICE_ACCOUNTS"] = False
            if config_dict["DATABASE_URL"]:
                await database.update_config({"USE_SERVICE_ACCOUNTS": False})
            Config.USE_SERVICE_ACCOUNTS = False
            await database.update_config({"USE_SERVICE_ACCOUNTS": False})
        elif file_name in [".netrc", "netrc"]:
            await (await create_subprocess_exec("touch", ".netrc")).wait()
            await (await create_subprocess_exec("chmod", "600", ".netrc")).wait()
@@ -474,7 +485,10 @@ async def update_private_file(_, message, pre_message):
        await delete_message(message)
    elif doc := message.document:
        file_name = doc.file_name
        await message.download(file_name=f"{getcwd()}/{file_name}")
        fpath = f"{getcwd()}/{file_name}"
        if await aiopath.exists(fpath):
            await remove(fpath)
        await message.download(file_name=fpath)
        if file_name == "accounts.zip":
            if await aiopath.exists("accounts"):
                await rmtree("accounts", ignore_errors=True)
@@ -492,10 +506,10 @@ async def update_private_file(_, message, pre_message):
            drives_ids.clear()
            drives_names.clear()
            index_urls.clear()
            if GDRIVE_ID := config_dict["GDRIVE_ID"]:
            if Config.GDRIVE_ID:
                drives_names.append("Main")
                drives_ids.append(GDRIVE_ID)
                index_urls.append(config_dict["INDEX_URL"])
                drives_ids.append(Config.GDRIVE_ID)
                index_urls.append(Config.INDEX_URL)
            async with aiopen("list_drives.txt", "r+") as f:
                lines = await f.readlines()
                for line in lines:
@@ -512,10 +526,9 @@ async def update_private_file(_, message, pre_message):
            file_name = ".netrc"
            await (await create_subprocess_exec("chmod", "600", ".netrc")).wait()
            await (await create_subprocess_exec("cp", ".netrc", "/root/.netrc")).wait()
        elif file_name == "config.env":
            load_dotenv("config.env", override=True)
        elif file_name == "config.py":
            await load_config()
        if "@github.com" in config_dict["UPSTREAM_REPO"]:
        if "@github.com" in Config.UPSTREAM_REPO:
            buttons = ButtonMaker()
            msg = "Push to UPSTREAM_REPO ?"
            buttons.data_button("Yes!", f"botset push {file_name}")
@@ -526,8 +539,7 @@ async def update_private_file(_, message, pre_message):
        if file_name == "rclone.conf":
            await rclone_serve_booter()
        await update_buttons(pre_message)
        if config_dict["DATABASE_URL"]:
            await database.update_private_file(file_name)
        await database.update_private_file(file_name)
        if await aiopath.exists("accounts.zip"):
            await remove("accounts.zip")
@@ -570,7 +582,7 @@ async def edit_bot_settings(client, query):
        globals()["start"] = 0
        await update_buttons(message, None)
    elif data[1] == "syncjd":
        if not config_dict["JD_EMAIL"] or not config_dict["JD_PASS"]:
        if not Config.JD_EMAIL or not Config.JD_PASS:
            await query.answer(
                "No Email or Password provided!",
                show_alert=True,
@@ -604,8 +616,8 @@ async def edit_bot_settings(client, query):
                value, update_status_message, key
            )
        elif data[2] == "EXTENSION_FILTER":
            global_extension_filter.clear()
            global_extension_filter.extend(["aria2", "!qB"])
            extension_filter.clear()
            extension_filter.extend(["aria2", "!qB"])
        elif data[2] == "TORRENT_TIMEOUT":
            downloads = await sync_to_async(aria2.get_downloads)
            for download in downloads:
@@ -619,13 +631,12 @@ async def edit_bot_settings(client, query):
                    except Exception as e:
                        LOGGER.error(e)
            aria2_options["bt-stop-timeout"] = "0"
            if config_dict["DATABASE_URL"]:
                await database.update_aria2("bt-stop-timeout", "0")
            await database.update_aria2("bt-stop-timeout", "0")
        elif data[2] == "BASE_URL":
            await (await create_subprocess_exec("pkill", "-9", "-f", "gunicorn")).wait()
        elif data[2] == "BASE_URL_PORT":
            value = 80
            if config_dict["BASE_URL"]:
            if Config.BASE_URL:
                await (
                    await create_subprocess_exec("pkill", "-9", "-f", "gunicorn")
                ).wait()
@@ -640,19 +651,18 @@ async def edit_bot_settings(client, query):
        elif data[2] == "INDEX_URL":
            if drives_names and drives_names[0] == "Main":
                index_urls[0] = ""
        elif data[2] == "INCOMPLETE_TASK_NOTIFIER" and config_dict["DATABASE_URL"]:
        elif data[2] == "INCOMPLETE_TASK_NOTIFIER":
            await database.trunc_table("tasks")
        elif data[2] in ["JD_EMAIL", "JD_PASS"]:
            await create_subprocess_exec("pkill", "-9", "-f", "java")
        elif data[2] == "USENET_SERVERS":
            for s in config_dict["USENET_SERVERS"]:
            for s in Config.USENET_SERVERS:
                await sabnzbd_client.delete_config("servers", s["name"])
        config_dict[data[2]] = value
        Config.set(data[2], value)
        await update_buttons(message, "var")
        if data[2] == "DATABASE_URL":
            await database.disconnect()
        if config_dict["DATABASE_URL"]:
            await database.update_config({data[2]: value})
        await database.update_config({data[2]: value})
        if data[2] in ["SEARCH_PLUGINS", "SEARCH_API_LINK"]:
            await initiate_search_tools()
        elif data[2] in ["QUEUE_ALL", "QUEUE_DOWNLOAD", "QUEUE_UPLOAD"]:
@@ -682,29 +692,27 @@ async def edit_bot_settings(client, query):
                    )
                except Exception as e:
                    LOGGER.error(e)
        if config_dict["DATABASE_URL"]:
            await database.update_aria2(data[2], value)
        await database.update_aria2(data[2], value)
    elif data[1] == "resetnzb":
        await query.answer()
        res = await sabnzbd_client.set_config_default(data[2])
        nzb_options[data[2]] = res["config"]["misc"][data[2]]
        await update_buttons(message, "nzb")
        if config_dict["DATABASE_URL"]:
            await database.update_nzb_config()
        await database.update_nzb_config()
    elif data[1] == "syncnzb":
        await query.answer(
            "Syncronization Started. It takes up to 2 sec!", show_alert=True
        )
        await get_nzb_options()
        if config_dict["DATABASE_URL"]:
            await database.update_nzb_config()
        nzb_options.clear()
        await update_nzb_options()
        await database.update_nzb_config()
    elif data[1] == "syncqbit":
        await query.answer(
            "Syncronization Started. It takes up to 2 sec!", show_alert=True
        )
        await sync_to_async(get_qb_options)
        if config_dict["DATABASE_URL"]:
            await database.save_qbit_settings()
        qbit_options.clear()
        await sync_to_async(update_qb_options)
        await database.save_qbit_settings()
    elif data[1] == "emptyaria":
        await query.answer()
        aria2_options[data[2]] = ""
@ -718,33 +726,28 @@ async def edit_bot_settings(client, query):
|
||||
)
|
||||
except Exception as e:
|
||||
LOGGER.error(e)
|
||||
if config_dict["DATABASE_URL"]:
|
||||
if Config.DATABASE_URL:
|
||||
await database.update_aria2(data[2], "")
|
||||
elif data[1] == "emptyqbit":
|
||||
await query.answer()
|
||||
await sync_to_async(qbittorrent_client.app_set_preferences, {data[2]: value})
|
||||
qbit_options[data[2]] = ""
|
||||
await update_buttons(message, "qbit")
|
||||
if config_dict["DATABASE_URL"]:
|
||||
await database.update_qbittorrent(data[2], "")
|
||||
await database.update_qbittorrent(data[2], "")
|
||||
elif data[1] == "emptynzb":
|
||||
await query.answer()
|
||||
res = await sabnzbd_client.set_config("misc", data[2], "")
|
||||
nzb_options[data[2]] = res["config"]["misc"][data[2]]
|
||||
await update_buttons(message, "nzb")
|
||||
if config_dict["DATABASE_URL"]:
|
||||
await database.update_nzb_config()
|
||||
await database.update_nzb_config()
|
||||
elif data[1] == "remser":
|
||||
index = int(data[2])
|
||||
await sabnzbd_client.delete_config(
|
||||
"servers", config_dict["USENET_SERVERS"][index]["name"]
|
||||
"servers", Config.USENET_SERVERS[index]["name"]
|
||||
)
|
||||
del config_dict["USENET_SERVERS"][index]
|
||||
del Config.USENET_SERVERS[index]
|
||||
await update_buttons(message, "nzbserver")
|
||||
if config_dict["DATABASE_URL"]:
|
||||
await database.update_config(
|
||||
{"USENET_SERVERS": config_dict["USENET_SERVERS"]}
|
||||
)
|
||||
await database.update_config({"USENET_SERVERS": Config.USENET_SERVERS})
|
||||
elif data[1] == "private":
|
||||
await query.answer()
|
||||
await update_buttons(message, data[1])
|
||||
@ -758,7 +761,7 @@ async def edit_bot_settings(client, query):
|
||||
rfunc = partial(update_buttons, message, "var")
|
||||
await event_handler(client, query, pfunc, rfunc)
|
||||
elif data[1] == "botvar" and state == "view":
|
||||
value = f"{config_dict[data[2]]}"
|
||||
value = f"{Config.get(data[2])}"
|
||||
if len(value) > 200:
|
||||
await query.answer()
|
||||
with BytesIO(str.encode(value)) as out_file:
|
||||
@ -824,15 +827,10 @@ async def edit_bot_settings(client, query):
|
||||
await update_buttons(message, f"nzbser{data[2]}")
|
||||
index = int(data[2])
|
||||
res = await sabnzbd_client.add_server(
|
||||
{"name": config_dict["USENET_SERVERS"][index]["name"], data[3]: ""}
|
||||
{"name": Config.USENET_SERVERS[index]["name"], data[3]: ""}
|
||||
)
|
||||
config_dict["USENET_SERVERS"][index][data[3]] = res["config"]["servers"][0][
|
||||
data[3]
|
||||
]
|
||||
if config_dict["DATABASE_URL"]:
|
||||
await database.update_config(
|
||||
{"USENET_SERVERS": config_dict["USENET_SERVERS"]}
|
||||
)
|
||||
Config.USENET_SERVERS[index][data[3]] = res["config"]["servers"][0][data[3]]
|
||||
await database.update_config({"USENET_SERVERS": Config.USENET_SERVERS})
|
||||
elif data[1].startswith("nzbsevar") and (state == "edit" or data[2] == "newser"):
|
||||
index = 0 if data[2] == "newser" else int(data[1].replace("nzbsevar", ""))
|
||||
await query.answer()
|
||||
@ -842,7 +840,7 @@ async def edit_bot_settings(client, query):
|
||||
await event_handler(client, query, pfunc, rfunc)
|
||||
elif data[1].startswith("nzbsevar") and state == "view":
|
||||
index = int(data[1].replace("nzbsevar", ""))
|
||||
value = f"{config_dict['USENET_SERVERS'][index][data[2]]}"
|
||||
value = f"{Config.USENET_SERVERS[index][data[2]]}"
|
||||
if len(value) > 200:
|
||||
await query.answer()
|
||||
with BytesIO(str.encode(value)) as out_file:
|
||||
@@ -872,16 +870,16 @@ async def edit_bot_settings(client, query):
await (
await create_subprocess_shell(
f"git add -f (unknown) \
&& git commit -sm botsettings -q \
&& git push origin {config_dict['UPSTREAM_BRANCH']} -qf"
&& git commit -sm botsettings -q \
&& git push origin {Config.UPSTREAM_BRANCH} -qf"
)
).wait()
else:
await (
await create_subprocess_shell(
f"git rm -r --cached (unknown) \
&& git commit -sm botsettings -q \
&& git push origin {config_dict['UPSTREAM_BRANCH']} -qf"
&& git commit -sm botsettings -q \
&& git push origin {Config.UPSTREAM_BRANCH} -qf"
)
).wait()
await delete_message(message.reply_to_message)
@@ -889,7 +887,7 @@ async def edit_bot_settings(client, query):


@new_task
async def bot_settings(_, message):
async def send_bot_settings(_, message):
handler_dict[message.chat.id] = False
msg, button = await get_buttons()
globals()["start"] = 0
@@ -897,171 +895,21 @@ async def bot_settings(_, message):


async def load_config():
BOT_TOKEN = environ.get("BOT_TOKEN", "")
if len(BOT_TOKEN) == 0:
BOT_TOKEN = config_dict["BOT_TOKEN"]
Config.load()
drives_ids.clear()
drives_names.clear()
index_urls.clear()
await update_variables()

TELEGRAM_API = environ.get("TELEGRAM_API", "")
if len(TELEGRAM_API) == 0:
TELEGRAM_API = config_dict["TELEGRAM_API"]
else:
TELEGRAM_API = int(TELEGRAM_API)

TELEGRAM_HASH = environ.get("TELEGRAM_HASH", "")
if len(TELEGRAM_HASH) == 0:
TELEGRAM_HASH = config_dict["TELEGRAM_HASH"]

OWNER_ID = environ.get("OWNER_ID", "")
OWNER_ID = config_dict["OWNER_ID"] if len(OWNER_ID) == 0 else int(OWNER_ID)

DATABASE_URL = environ.get("DATABASE_URL", "")
if len(DATABASE_URL) == 0:
DATABASE_URL = ""

DOWNLOAD_DIR = environ.get("DOWNLOAD_DIR", "")
if len(DOWNLOAD_DIR) == 0:
DOWNLOAD_DIR = "/usr/src/app/downloads/"
elif not DOWNLOAD_DIR.endswith("/"):
DOWNLOAD_DIR = f"{DOWNLOAD_DIR}/"

GDRIVE_ID = environ.get("GDRIVE_ID", "")
if len(GDRIVE_ID) == 0:
GDRIVE_ID = ""

RCLONE_PATH = environ.get("RCLONE_PATH", "")
if len(RCLONE_PATH) == 0:
RCLONE_PATH = ""

DEFAULT_UPLOAD = environ.get("DEFAULT_UPLOAD", "")
if DEFAULT_UPLOAD != "gd":
DEFAULT_UPLOAD = "rc"

RCLONE_FLAGS = environ.get("RCLONE_FLAGS", "")
if len(RCLONE_FLAGS) == 0:
RCLONE_FLAGS = ""

AUTHORIZED_CHATS = environ.get("AUTHORIZED_CHATS", "")
if len(AUTHORIZED_CHATS) != 0:
aid = AUTHORIZED_CHATS.split()
for id_ in aid:
chat_id, *thread_ids = id_.split("|")
chat_id = int(chat_id.strip())
if thread_ids:
thread_ids = list(map(lambda x: int(x.strip()), thread_ids))
user_data[chat_id] = {"is_auth": True, "thread_ids": thread_ids}
else:
user_data[chat_id] = {"is_auth": True}

SUDO_USERS = environ.get("SUDO_USERS", "")
if len(SUDO_USERS) != 0:
aid = SUDO_USERS.split()
for id_ in aid:
user_data[int(id_.strip())] = {"is_sudo": True}

EXTENSION_FILTER = environ.get("EXTENSION_FILTER", "")
if len(EXTENSION_FILTER) > 0:
fx = EXTENSION_FILTER.split()
global_extension_filter.clear()
global_extension_filter.extend(["aria2", "!qB"])
for x in fx:
if x.strip().startswith("."):
x = x.lstrip(".")
global_extension_filter.append(x.strip().lower())

JD_EMAIL = environ.get("JD_EMAIL", "")
JD_PASS = environ.get("JD_PASS", "")
if len(JD_EMAIL) == 0 or len(JD_PASS) == 0:
JD_EMAIL = ""
JD_PASS = ""

USENET_SERVERS = environ.get("USENET_SERVERS", "")
try:
if len(USENET_SERVERS) == 0:
USENET_SERVERS = []
elif (us := eval(USENET_SERVERS)) and not us[0].get("host"):
USENET_SERVERS = []
else:
USENET_SERVERS = eval(USENET_SERVERS)
except:
LOGGER.error(f"Wrong USENET_SERVERS format: {USENET_SERVERS}")
USENET_SERVERS = []

FILELION_API = environ.get("FILELION_API", "")
if len(FILELION_API) == 0:
FILELION_API = ""

STREAMWISH_API = environ.get("STREAMWISH_API", "")
if len(STREAMWISH_API) == 0:
STREAMWISH_API = ""

INDEX_URL = environ.get("INDEX_URL", "").rstrip("/")
if len(INDEX_URL) == 0:
INDEX_URL = ""

SEARCH_API_LINK = environ.get("SEARCH_API_LINK", "").rstrip("/")
if len(SEARCH_API_LINK) == 0:
SEARCH_API_LINK = ""

LEECH_FILENAME_PREFIX = environ.get("LEECH_FILENAME_PREFIX", "")
if len(LEECH_FILENAME_PREFIX) == 0:
LEECH_FILENAME_PREFIX = ""

SEARCH_PLUGINS = environ.get("SEARCH_PLUGINS", "")
if len(SEARCH_PLUGINS) == 0:
SEARCH_PLUGINS = ""
else:
try:
SEARCH_PLUGINS = eval(SEARCH_PLUGINS)
except:
LOGGER.error(f"Wrong SEARCH_PLUGINS fornat {SEARCH_PLUGINS}")
SEARCH_PLUGINS = ""

MAX_SPLIT_SIZE = 4194304000 if IS_PREMIUM_USER else 2097152000

LEECH_SPLIT_SIZE = environ.get("LEECH_SPLIT_SIZE", "")
if len(LEECH_SPLIT_SIZE) == 0 or int(LEECH_SPLIT_SIZE) > MAX_SPLIT_SIZE:
LEECH_SPLIT_SIZE = MAX_SPLIT_SIZE
else:
LEECH_SPLIT_SIZE = int(LEECH_SPLIT_SIZE)

STATUS_UPDATE_INTERVAL = environ.get("STATUS_UPDATE_INTERVAL", "")
if len(STATUS_UPDATE_INTERVAL) == 0:
STATUS_UPDATE_INTERVAL = 15
else:
STATUS_UPDATE_INTERVAL = int(STATUS_UPDATE_INTERVAL)
if len(task_dict) != 0 and (st := intervals["status"]):
for key, intvl in list(st.items()):
intvl.cancel()
intervals["status"][key] = SetInterval(
STATUS_UPDATE_INTERVAL, update_status_message, key
Config.STATUS_UPDATE_INTERVAL, update_status_message, key
)

YT_DLP_OPTIONS = environ.get("YT_DLP_OPTIONS", "")
if len(YT_DLP_OPTIONS) == 0:
YT_DLP_OPTIONS = ""

SEARCH_LIMIT = environ.get("SEARCH_LIMIT", "")
SEARCH_LIMIT = 0 if len(SEARCH_LIMIT) == 0 else int(SEARCH_LIMIT)

LEECH_DUMP_CHAT = environ.get("LEECH_DUMP_CHAT", "")
LEECH_DUMP_CHAT = "" if len(LEECH_DUMP_CHAT) == 0 else LEECH_DUMP_CHAT

STATUS_LIMIT = environ.get("STATUS_LIMIT", "")
STATUS_LIMIT = 4 if len(STATUS_LIMIT) == 0 else int(STATUS_LIMIT)

RSS_CHAT = environ.get("RSS_CHAT", "")
RSS_CHAT = "" if len(RSS_CHAT) == 0 else RSS_CHAT

RSS_DELAY = environ.get("RSS_DELAY", "")
RSS_DELAY = 600 if len(RSS_DELAY) == 0 else int(RSS_DELAY)

CMD_SUFFIX = environ.get("CMD_SUFFIX", "")

USER_SESSION_STRING = environ.get("USER_SESSION_STRING", "")

TORRENT_TIMEOUT = environ.get("TORRENT_TIMEOUT", "")
downloads = aria2.get_downloads()
if len(TORRENT_TIMEOUT) == 0:
if not Config.TORRENT_TIMEOUT:
for download in downloads:
if not download.is_complete:
try:
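Most of the removed `load_config` body above is hand-rolled parsing of environment strings, now replaced by `Config.load()`. One representative piece is the `AUTHORIZED_CHATS` format, where each entry is `chat_id` optionally followed by `|`-separated topic thread ids. A standalone sketch of that same parsing logic:

```python
def parse_authorized_chats(raw):
    # "chat|thread|thread chat2" -> {chat_id: {"is_auth": True, "thread_ids": [...]}}
    user_data = {}
    for entry in raw.split():
        chat_id, *thread_ids = entry.split("|")
        chat_id = int(chat_id.strip())
        if thread_ids:
            user_data[chat_id] = {
                "is_auth": True,
                "thread_ids": [int(t.strip()) for t in thread_ids],
            }
        else:
            user_data[chat_id] = {"is_auth": True}
    return user_data
```

For example, `parse_authorized_chats("-100123|4|7 -100456")` authorizes chat `-100123` only in threads 4 and 7, and chat `-100456` everywhere.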
@@ -1073,9 +921,7 @@ async def load_config():
except Exception as e:
LOGGER.error(e)
aria2_options["bt-stop-timeout"] = "0"
if config_dict["DATABASE_URL"]:
await database.update_aria2("bt-stop-timeout", "0")
TORRENT_TIMEOUT = ""
await database.update_aria2("bt-stop-timeout", "0")
else:
for download in downloads:
if not download.is_complete:
@@ -1083,205 +929,27 @@ async def load_config():
await sync_to_async(
aria2.client.change_option,
download.gid,
{"bt-stop-timeout": TORRENT_TIMEOUT},
{"bt-stop-timeout": Config.TORRENT_TIMEOUT},
)
except Exception as e:
LOGGER.error(e)
aria2_options["bt-stop-timeout"] = TORRENT_TIMEOUT
if config_dict["DATABASE_URL"]:
await database.update_aria2("bt-stop-timeout", TORRENT_TIMEOUT)
TORRENT_TIMEOUT = int(TORRENT_TIMEOUT)
aria2_options["bt-stop-timeout"] = Config.TORRENT_TIMEOUT
await database.update_aria2("bt-stop-timeout", Config.TORRENT_TIMEOUT)

QUEUE_ALL = environ.get("QUEUE_ALL", "")
QUEUE_ALL = "" if len(QUEUE_ALL) == 0 else int(QUEUE_ALL)

QUEUE_DOWNLOAD = environ.get("QUEUE_DOWNLOAD", "")
QUEUE_DOWNLOAD = "" if len(QUEUE_DOWNLOAD) == 0 else int(QUEUE_DOWNLOAD)

QUEUE_UPLOAD = environ.get("QUEUE_UPLOAD", "")
QUEUE_UPLOAD = "" if len(QUEUE_UPLOAD) == 0 else int(QUEUE_UPLOAD)

INCOMPLETE_TASK_NOTIFIER = environ.get("INCOMPLETE_TASK_NOTIFIER", "")
INCOMPLETE_TASK_NOTIFIER = INCOMPLETE_TASK_NOTIFIER.lower() == "true"
if not INCOMPLETE_TASK_NOTIFIER and config_dict["DATABASE_URL"]:
if not Config.INCOMPLETE_TASK_NOTIFIER:
await database.trunc_table("tasks")

STOP_DUPLICATE = environ.get("STOP_DUPLICATE", "")
STOP_DUPLICATE = STOP_DUPLICATE.lower() == "true"

IS_TEAM_DRIVE = environ.get("IS_TEAM_DRIVE", "")
IS_TEAM_DRIVE = IS_TEAM_DRIVE.lower() == "true"

USE_SERVICE_ACCOUNTS = environ.get("USE_SERVICE_ACCOUNTS", "")
USE_SERVICE_ACCOUNTS = USE_SERVICE_ACCOUNTS.lower() == "true"

WEB_PINCODE = environ.get("WEB_PINCODE", "")
WEB_PINCODE = WEB_PINCODE.lower() == "true"

AS_DOCUMENT = environ.get("AS_DOCUMENT", "")
AS_DOCUMENT = AS_DOCUMENT.lower() == "true"

EQUAL_SPLITS = environ.get("EQUAL_SPLITS", "")
EQUAL_SPLITS = EQUAL_SPLITS.lower() == "true"

MEDIA_GROUP = environ.get("MEDIA_GROUP", "")
MEDIA_GROUP = MEDIA_GROUP.lower() == "true"

USER_TRANSMISSION = environ.get("USER_TRANSMISSION", "")
USER_TRANSMISSION = USER_TRANSMISSION.lower() == "true" and IS_PREMIUM_USER

BASE_URL_PORT = environ.get("BASE_URL_PORT", "")
BASE_URL_PORT = 80 if len(BASE_URL_PORT) == 0 else int(BASE_URL_PORT)

RCLONE_SERVE_URL = environ.get("RCLONE_SERVE_URL", "")
if len(RCLONE_SERVE_URL) == 0:
RCLONE_SERVE_URL = ""

RCLONE_SERVE_PORT = environ.get("RCLONE_SERVE_PORT", "")
RCLONE_SERVE_PORT = 8080 if len(RCLONE_SERVE_PORT) == 0 else int(RCLONE_SERVE_PORT)

RCLONE_SERVE_USER = environ.get("RCLONE_SERVE_USER", "")
if len(RCLONE_SERVE_USER) == 0:
RCLONE_SERVE_USER = ""

RCLONE_SERVE_PASS = environ.get("RCLONE_SERVE_PASS", "")
if len(RCLONE_SERVE_PASS) == 0:
RCLONE_SERVE_PASS = ""

NAME_SUBSTITUTE = environ.get("NAME_SUBSTITUTE", "")
NAME_SUBSTITUTE = "" if len(NAME_SUBSTITUTE) == 0 else NAME_SUBSTITUTE

MIXED_LEECH = environ.get("MIXED_LEECH", "")
MIXED_LEECH = MIXED_LEECH.lower() == "true" and IS_PREMIUM_USER

THUMBNAIL_LAYOUT = environ.get("THUMBNAIL_LAYOUT", "")
THUMBNAIL_LAYOUT = "" if len(THUMBNAIL_LAYOUT) == 0 else THUMBNAIL_LAYOUT

FFMPEG_CMDS = environ.get("FFMPEG_CMDS", "")
try:
FFMPEG_CMDS = [] if len(FFMPEG_CMDS) == 0 else eval(FFMPEG_CMDS)
except:
LOGGER.error(f"Wrong FFMPEG_CMDS format: {FFMPEG_CMDS}")
FFMPEG_CMDS = []

await (await create_subprocess_exec("pkill", "-9", "-f", "gunicorn")).wait()
BASE_URL = environ.get("BASE_URL", "").rstrip("/")
if len(BASE_URL) == 0:
BASE_URL = ""
else:
if Config.BASE_URL:
await create_subprocess_shell(
f"gunicorn web.wserver:app --bind 0.0.0.0:{BASE_URL_PORT} --worker-class gevent"
f"gunicorn web.wserver:app --bind 0.0.0.0:{Config.BASE_URL_PORT} --worker-class gevent"
)

UPSTREAM_REPO = environ.get("UPSTREAM_REPO", "")
if len(UPSTREAM_REPO) == 0:
UPSTREAM_REPO = ""

UPSTREAM_BRANCH = environ.get("UPSTREAM_BRANCH", "")
if len(UPSTREAM_BRANCH) == 0:
UPSTREAM_BRANCH = "master"

drives_ids.clear()
drives_names.clear()
index_urls.clear()

if GDRIVE_ID:
drives_names.append("Main")
drives_ids.append(GDRIVE_ID)
index_urls.append(INDEX_URL)

if await aiopath.exists("list_drives.txt"):
async with aiopen("list_drives.txt", "r+") as f:
lines = await f.readlines()
for line in lines:
temp = line.strip().split()
drives_ids.append(temp[1])
drives_names.append(temp[0].replace("_", " "))
if len(temp) > 2:
index_urls.append(temp[2])
else:
index_urls.append("")

config_dict.update(
{
"AS_DOCUMENT": AS_DOCUMENT,
"AUTHORIZED_CHATS": AUTHORIZED_CHATS,
"BASE_URL": BASE_URL,
"BASE_URL_PORT": BASE_URL_PORT,
"BOT_TOKEN": BOT_TOKEN,
"CMD_SUFFIX": CMD_SUFFIX,
"DATABASE_URL": DATABASE_URL,
"DEFAULT_UPLOAD": DEFAULT_UPLOAD,
"DOWNLOAD_DIR": DOWNLOAD_DIR,
"EQUAL_SPLITS": EQUAL_SPLITS,
"EXTENSION_FILTER": EXTENSION_FILTER,
"FFMPEG_CMDS": FFMPEG_CMDS,
"FILELION_API": FILELION_API,
"GDRIVE_ID": GDRIVE_ID,
"INCOMPLETE_TASK_NOTIFIER": INCOMPLETE_TASK_NOTIFIER,
"INDEX_URL": INDEX_URL,
"IS_TEAM_DRIVE": IS_TEAM_DRIVE,
"JD_EMAIL": JD_EMAIL,
"JD_PASS": JD_PASS,
"LEECH_DUMP_CHAT": LEECH_DUMP_CHAT,
"LEECH_FILENAME_PREFIX": LEECH_FILENAME_PREFIX,
"LEECH_SPLIT_SIZE": LEECH_SPLIT_SIZE,
"MEDIA_GROUP": MEDIA_GROUP,
"MIXED_LEECH": MIXED_LEECH,
"NAME_SUBSTITUTE": NAME_SUBSTITUTE,
"OWNER_ID": OWNER_ID,
"QUEUE_ALL": QUEUE_ALL,
"QUEUE_DOWNLOAD": QUEUE_DOWNLOAD,
"QUEUE_UPLOAD": QUEUE_UPLOAD,
"RCLONE_FLAGS": RCLONE_FLAGS,
"RCLONE_PATH": RCLONE_PATH,
"RCLONE_SERVE_URL": RCLONE_SERVE_URL,
"RCLONE_SERVE_USER": RCLONE_SERVE_USER,
"RCLONE_SERVE_PASS": RCLONE_SERVE_PASS,
"RCLONE_SERVE_PORT": RCLONE_SERVE_PORT,
"RSS_CHAT": RSS_CHAT,
"RSS_DELAY": RSS_DELAY,
"SEARCH_API_LINK": SEARCH_API_LINK,
"SEARCH_LIMIT": SEARCH_LIMIT,
"SEARCH_PLUGINS": SEARCH_PLUGINS,
"STATUS_LIMIT": STATUS_LIMIT,
"STATUS_UPDATE_INTERVAL": STATUS_UPDATE_INTERVAL,
"STOP_DUPLICATE": STOP_DUPLICATE,
"STREAMWISH_API": STREAMWISH_API,
"SUDO_USERS": SUDO_USERS,
"TELEGRAM_API": TELEGRAM_API,
"TELEGRAM_HASH": TELEGRAM_HASH,
"THUMBNAIL_LAYOUT": THUMBNAIL_LAYOUT,
"TORRENT_TIMEOUT": TORRENT_TIMEOUT,
"USER_TRANSMISSION": USER_TRANSMISSION,
"UPSTREAM_REPO": UPSTREAM_REPO,
"UPSTREAM_BRANCH": UPSTREAM_BRANCH,
"USENET_SERVERS": USENET_SERVERS,
"USER_SESSION_STRING": USER_SESSION_STRING,
"USE_SERVICE_ACCOUNTS": USE_SERVICE_ACCOUNTS,
"WEB_PINCODE": WEB_PINCODE,
"YT_DLP_OPTIONS": YT_DLP_OPTIONS,
}
)

if config_dict["DATABASE_URL"]:
if Config.DATABASE_URL:
await database.connect()
config_dict = Config.get_all()
await database.update_config(config_dict)
else:
await database.disconnect()
await gather(initiate_search_tools(), start_from_queued(), rclone_serve_booter())
add_job()


bot.add_handler(
MessageHandler(
bot_settings,
filters=command(BotCommands.BotSetCommand, case_sensitive=True)
& CustomFilters.sudo,
)
)
bot.add_handler(
CallbackQueryHandler(
edit_bot_settings, filters=regex("^botset") & CustomFilters.sudo
)
)
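The removed code above parses list- and dict-valued settings such as `USENET_SERVERS` and `FFMPEG_CMDS` with `eval()`, which executes arbitrary Python from the environment. A safer equivalent (a sketch, not the repository's code; `ast.literal_eval` accepts only Python literals) is:

```python
from ast import literal_eval


def parse_literal(raw, default):
    # literal_eval only evaluates literals (lists, dicts, strings, numbers),
    # so a malformed or malicious string raises instead of executing code
    if not raw:
        return default
    try:
        return literal_eval(raw)
    except (ValueError, SyntaxError):
        return default
```

For example, `parse_literal("[{'host': 'news.example'}]", [])` returns the server list, while `parse_literal("__import__('os')", [])` falls back to the default instead of running the import.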
@@ -1,8 +1,7 @@
from asyncio import sleep
from pyrogram.filters import command, regex
from pyrogram.handlers import MessageHandler, CallbackQueryHandler

from bot import task_dict, bot, task_dict_lock, OWNER_ID, user_data, multi_tags
from .. import task_dict, task_dict_lock, user_data, multi_tags
from ..core.mltb_client import Config
from ..helper.ext_utils.bot_utils import new_task
from ..helper.ext_utils.status_utils import (
get_task_by_gid,
@@ -21,7 +20,7 @@ from ..helper.telegram_helper.message_utils import (


@new_task
async def cancel_task(_, message):
async def cancel(_, message):
user_id = message.from_user.id if message.from_user else message.sender_chat.id
msg = message.text.split()
if len(msg) > 1:
@@ -48,7 +47,7 @@ async def cancel_task(_, message):
await send_message(message, msg)
return
if (
OWNER_ID != user_id
Config.OWNER_ID != user_id
and task.listener.user_id != user_id
and (user_id not in user_data or not user_data[user_id].get("is_sudo"))
):
@@ -95,9 +94,7 @@ def create_cancel_buttons(is_sudo, user_id=""):
"Uploading", f"canall ms {MirrorStatus.STATUS_UPLOAD} {user_id}"
)
buttons.data_button("Seeding", f"canall ms {MirrorStatus.STATUS_SEED} {user_id}")
buttons.data_button(
"Spltting", f"canall ms {MirrorStatus.STATUS_SPLIT} {user_id}"
)
buttons.data_button("Spltting", f"canall ms {MirrorStatus.STATUS_SPLIT} {user_id}")
buttons.data_button("Cloning", f"canall ms {MirrorStatus.STATUS_CLONE} {user_id}")
buttons.data_button(
"Extracting", f"canall ms {MirrorStatus.STATUS_EXTRACT} {user_id}"
@@ -130,7 +127,7 @@ def create_cancel_buttons(is_sudo, user_id=""):


@new_task
async def cancell_all_buttons(_, message):
async def cancel_all_buttons(_, message):
async with task_dict_lock:
count = len(task_dict)
if count == 0:
@@ -180,21 +177,3 @@ async def cancel_all_update(_, query):
res = await cancel_all(data[1], user_id)
if not res:
await send_message(reply_to, f"No matching tasks for {data[1]}!")


bot.add_handler(
MessageHandler(
cancel_task,
filters=command(BotCommands.CancelTaskCommand, case_sensitive=True)
& CustomFilters.authorized,
)
)
bot.add_handler(
MessageHandler(
cancell_all_buttons,
filters=command(BotCommands.CancelAllCommand, case_sensitive=True)
& CustomFilters.authorized,
)
)
bot.add_handler(CallbackQueryHandler(cancel_all_update, filters=regex("^canall")))
bot.add_handler(CallbackQueryHandler(cancel_multi, filters=regex("^stopm")))
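The permission guard shown in this hunk (and repeated in several modules below) allows an action when the caller is the bot owner, the task's creator, or a sudo user. Extracted as a standalone helper for clarity (a sketch; the repo keeps this check inline):

```python
def can_control_task(user_id, task_owner_id, owner_id, user_data):
    # the bot owner and the task's creator may always act on the task
    if user_id == owner_id or user_id == task_owner_id:
        return True
    # otherwise the caller must be flagged as sudo in user_data
    return user_id in user_data and bool(user_data[user_id].get("is_sudo"))
```

Note the inline version is written in the negative (it rejects when none of the three conditions hold), which is logically equivalent.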
@@ -1,11 +1,7 @@
from pyrogram.filters import command
from pyrogram.handlers import MessageHandler

from bot import user_data, config_dict, bot
from .. import user_data
from ..core.config_manager import Config
from ..helper.ext_utils.bot_utils import update_user_ldata, new_task
from ..helper.ext_utils.db_handler import database
from ..helper.telegram_helper.bot_commands import BotCommands
from ..helper.telegram_helper.filters import CustomFilters
from ..helper.telegram_helper.message_utils import send_message


@@ -43,8 +39,7 @@ async def authorize(_, message):
update_user_ldata(chat_id, "is_auth", True)
if thread_id is not None:
update_user_ldata(chat_id, "thread_ids", [thread_id])
if config_dict["DATABASE_URL"]:
await database.update_user_data(chat_id)
await database.update_user_data(chat_id)
msg = "Authorized"
await send_message(message, msg)

@@ -73,8 +68,7 @@ async def unauthorize(_, message):
user_data[chat_id]["thread_ids"].remove(thread_id)
else:
update_user_ldata(chat_id, "is_auth", False)
if config_dict["DATABASE_URL"]:
await database.update_user_data(chat_id)
await database.update_user_data(chat_id)
msg = "Unauthorized"
else:
msg = "Already Unauthorized!"
@@ -94,8 +88,7 @@ async def add_sudo(_, message):
msg = "Already Sudo!"
else:
update_user_ldata(id_, "is_sudo", True)
if config_dict["DATABASE_URL"]:
await database.update_user_data(id_)
await database.update_user_data(id_)
msg = "Promoted as Sudo"
else:
msg = "Give ID or Reply To message of whom you want to Promote."
@@ -112,39 +105,8 @@ async def remove_sudo(_, message):
id_ = reply_to.from_user.id if reply_to.from_user else reply_to.sender_chat.id
if id_ and id_ not in user_data or user_data[id_].get("is_sudo"):
update_user_ldata(id_, "is_sudo", False)
if config_dict["DATABASE_URL"]:
await database.update_user_data(id_)
await database.update_user_data(id_)
msg = "Demoted"
else:
msg = "Give ID or Reply To message of whom you want to remove from Sudo"
await send_message(message, msg)


bot.add_handler(
MessageHandler(
authorize,
filters=command(BotCommands.AuthorizeCommand, case_sensitive=True)
& CustomFilters.sudo,
)
)
bot.add_handler(
MessageHandler(
unauthorize,
filters=command(BotCommands.UnAuthorizeCommand, case_sensitive=True)
& CustomFilters.sudo,
)
)
bot.add_handler(
MessageHandler(
add_sudo,
filters=command(BotCommands.AddSudoCommand, case_sensitive=True)
& CustomFilters.sudo,
)
)
bot.add_handler(
MessageHandler(
remove_sudo,
filters=command(BotCommands.RmSudoCommand, case_sensitive=True)
& CustomFilters.sudo,
)
)
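The authorize/sudo handlers above write flags through `update_user_ldata(id_, key, value)` before persisting to the database. A plausible minimal implementation of that helper, assumed here for illustration only (the real one lives in `..helper.ext_utils.bot_utils`):

```python
# module-level store of per-user flags, keyed by Telegram user/chat id
user_data = {}


def update_user_ldata(id_, key, value):
    # create the per-user dict on first write, then set the single key,
    # leaving any other flags for that user untouched
    user_data.setdefault(id_, {})[key] = value
```

This is why the handlers can set `is_auth`, `is_sudo`, and `thread_ids` independently for the same id.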
@@ -1,11 +1,9 @@
from asyncio import gather
from json import loads
from pyrogram.filters import command
from pyrogram.handlers import MessageHandler
from secrets import token_urlsafe
from aiofiles.os import remove

from bot import LOGGER, task_dict, task_dict_lock, bot, bot_loop
from .. import LOGGER, task_dict, task_dict_lock, bot_loop
from ..helper.ext_utils.bot_utils import (
sync_to_async,
cmd_exec,
@@ -29,8 +27,6 @@ from ..helper.mirror_leech_utils.gdrive_utils.count import GoogleDriveCount
from ..helper.mirror_leech_utils.rclone_utils.transfer import RcloneTransferHelper
from ..helper.mirror_leech_utils.status_utils.gdrive_status import GoogleDriveStatus
from ..helper.mirror_leech_utils.status_utils.rclone_status import RcloneStatus
from ..helper.telegram_helper.bot_commands import BotCommands
from ..helper.telegram_helper.filters import CustomFilters
from ..helper.telegram_helper.message_utils import (
send_message,
delete_message,
@@ -195,6 +191,11 @@ class Clone(TaskListener):
"--config",
config_path,
f"{remote}:{src_path}",
"--log-systemd",
"--log-file",
"rlog.txt",
"--log-level",
"ERROR",
]
res = await cmd_exec(cmd)
if res[2] != 0:
@@ -297,14 +298,5 @@ class Clone(TaskListener):
)


async def clone(client, message):
async def clone_node(client, message):
bot_loop.create_task(Clone(client, message).new_event())


bot.add_handler(
MessageHandler(
clone,
filters=command(BotCommands.CloneCommand, case_sensitive=True)
& CustomFilters.authorized,
)
)
@@ -2,15 +2,12 @@ from aiofiles import open as aiopen
from contextlib import redirect_stdout
from io import StringIO, BytesIO
from os import path as ospath, getcwd, chdir
from pyrogram.filters import command
from pyrogram.handlers import MessageHandler
from textwrap import indent
from traceback import format_exc

from bot import LOGGER, bot
from .. import LOGGER
from ..core.mltb_client import TgClient
from ..helper.ext_utils.bot_utils import sync_to_async, new_task
from ..helper.telegram_helper.bot_commands import BotCommands
from ..helper.telegram_helper.filters import CustomFilters
from ..helper.telegram_helper.message_utils import send_file, send_message

namespaces = {}
@@ -20,7 +17,7 @@ def namespace_of(message):
if message.chat.id not in namespaces:
namespaces[message.chat.id] = {
"__builtins__": globals()["__builtins__"],
"bot": bot,
"bot": TgClient.bot,
"message": message,
"user": message.from_user or message.sender_chat,
"chat": message.chat,
@@ -115,26 +112,3 @@ async def clear(_, message):
if message.chat.id in namespaces:
del namespaces[message.chat.id]
await send("Locals Cleared.", message)


bot.add_handler(
MessageHandler(
aioexecute,
filters=command(BotCommands.AExecCommand, case_sensitive=True)
& CustomFilters.owner,
)
)
bot.add_handler(
MessageHandler(
execute,
filters=command(BotCommands.ExecCommand, case_sensitive=True)
& CustomFilters.owner,
)
)
bot.add_handler(
MessageHandler(
clear,
filters=command(BotCommands.ClearLocalsCommand, case_sensitive=True)
& CustomFilters.owner,
)
)
@@ -1,26 +1,20 @@
from aiofiles.os import remove, path as aiopath
from pyrogram.filters import command, regex
from pyrogram.handlers import MessageHandler, CallbackQueryHandler

from bot import (
bot,
from .. import (
aria2,
task_dict,
task_dict_lock,
OWNER_ID,
user_data,
LOGGER,
config_dict,
qbittorrent_client,
)
from ..core.config_manager import Config
from ..helper.ext_utils.bot_utils import (
bt_selection_buttons,
sync_to_async,
new_task,
)
from ..helper.ext_utils.status_utils import get_task_by_gid, MirrorStatus
from ..helper.telegram_helper.bot_commands import BotCommands
from ..helper.telegram_helper.filters import CustomFilters
from ..helper.telegram_helper.message_utils import (
send_message,
send_status_message,
@@ -30,7 +24,7 @@ from ..helper.telegram_helper.message_utils import (

@new_task
async def select(_, message):
if not config_dict["BASE_URL"]:
if not Config.BASE_URL:
await send_message(message, "Base URL not defined!")
return
user_id = message.from_user.id
@@ -57,7 +51,7 @@ async def select(_, message):
return

if (
OWNER_ID != user_id
Config.OWNER_ID != user_id
and task.listener.user_id != user_id
and (user_id not in user_data or not user_data[user_id].get("is_sudo"))
):
@@ -105,7 +99,7 @@ async def select(_, message):


@new_task
async def get_confirm(_, query):
async def confirm_selection(_, query):
user_id = query.from_user.id
data = query.data.split()
message = query.message
@@ -165,13 +159,3 @@ async def get_confirm(_, query):
else:
await delete_message(message)
await task.cancel_task()


bot.add_handler(
MessageHandler(
select,
filters=command(BotCommands.SelectCommand, case_sensitive=True)
& CustomFilters.authorized,
)
)
bot.add_handler(CallbackQueryHandler(get_confirm, filters=regex("^sel")))
@@ -1,16 +1,12 @@
from pyrogram.filters import command
from pyrogram.handlers import MessageHandler

from bot import (
from .. import (
task_dict,
bot,
task_dict_lock,
OWNER_ID,
user_data,
queued_up,
queued_dl,
queue_dict_lock,
)
from ..core.config_manager import Config
from ..helper.ext_utils.bot_utils import new_task
from ..helper.ext_utils.status_utils import get_task_by_gid
from ..helper.telegram_helper.bot_commands import BotCommands
@@ -44,7 +40,7 @@ async def remove_from_queue(_, message):
await send_message(message, msg)
return
if (
OWNER_ID != user_id
Config.OWNER_ID != user_id
and task.listener.user_id != user_id
and (user_id not in user_data or not user_data[user_id].get("is_sudo"))
):
@@ -74,12 +70,3 @@ async def remove_from_queue(_, message):
msg = "Task have been force started to download and upload will start once download finish!"
if msg:
await send_message(message, msg)


bot.add_handler(
MessageHandler(
remove_from_queue,
filters=command(BotCommands.ForceStartCommand, case_sensitive=True)
& CustomFilters.authorized,
)
)
@@ -1,13 +1,7 @@
from pyrogram.filters import command
from pyrogram.handlers import MessageHandler

from bot import bot
from ..helper.ext_utils.bot_utils import sync_to_async, new_task
from ..helper.ext_utils.links_utils import is_gdrive_link
from ..helper.ext_utils.status_utils import get_readable_file_size
from ..helper.mirror_leech_utils.gdrive_utils.count import GoogleDriveCount
from ..helper.telegram_helper.bot_commands import BotCommands
from ..helper.telegram_helper.filters import CustomFilters
from ..helper.telegram_helper.message_utils import delete_message, send_message


@@ -48,10 +42,4 @@ async def count_node(_, message):
await send_message(message, msg)


bot.add_handler(
MessageHandler(
count_node,
filters=command(BotCommands.CountCommand, case_sensitive=True)
& CustomFilters.authorized,
)
)
@@ -1,17 +1,12 @@
from pyrogram.filters import command
from pyrogram.handlers import MessageHandler

from bot import bot, LOGGER
from .. import LOGGER
from ..helper.ext_utils.bot_utils import sync_to_async, new_task
from ..helper.ext_utils.links_utils import is_gdrive_link
from ..helper.mirror_leech_utils.gdrive_utils.delete import GoogleDriveDelete
from ..helper.telegram_helper.bot_commands import BotCommands
from ..helper.telegram_helper.filters import CustomFilters
from ..helper.telegram_helper.message_utils import auto_delete_message, send_message


@new_task
async def deletefile(_, message):
async def delete_file(_, message):
args = message.text.split()
user = message.from_user or message.sender_chat
if len(args) > 1:
@@ -31,10 +26,4 @@ async def deletefile(_, message):
await auto_delete_message(message, reply_message)


bot.add_handler(
MessageHandler(
deletefile,
filters=command(BotCommands.DeleteCommand, case_sensitive=True)
& CustomFilters.authorized,
)
)
@@ -1,16 +1,11 @@
from pyrogram.filters import command, regex
from pyrogram.handlers import MessageHandler, CallbackQueryHandler

from bot import LOGGER, bot, user_data
from .. import LOGGER, user_data
from ..helper.ext_utils.bot_utils import (
sync_to_async,
get_telegraph_list,
new_task,
)
from ..helper.mirror_leech_utils.gdrive_utils.search import GoogleDriveSearch
from ..helper.telegram_helper.bot_commands import BotCommands
from ..helper.telegram_helper.button_build import ButtonMaker
from ..helper.telegram_helper.filters import CustomFilters
from ..helper.telegram_helper.message_utils import send_message, edit_message


@@ -101,11 +96,4 @@ async def gdrive_search(_, message):
await send_message(message, "Choose list options:", buttons)


bot.add_handler(
MessageHandler(
gdrive_search,
filters=command(BotCommands.ListCommand, case_sensitive=True)
& CustomFilters.authorized,
)
)
bot.add_handler(CallbackQueryHandler(select_type, filters=regex("^list_types")))
@@ -1,7 +1,3 @@
from pyrogram.filters import regex
from pyrogram.handlers import CallbackQueryHandler

from bot import bot
from ..helper.ext_utils.bot_utils import COMMAND_USAGE, new_task
from ..helper.ext_utils.help_messages import (
    YT_HELP_DICT,
@@ -9,7 +5,8 @@ from ..helper.ext_utils.help_messages import (
    CLONE_HELP_DICT,
)
from ..helper.telegram_helper.button_build import ButtonMaker
from ..helper.telegram_helper.message_utils import edit_message, delete_message
from ..helper.telegram_helper.message_utils import edit_message, delete_message, send_message
from ..helper.ext_utils.help_messages import help_string


@new_task
@@ -46,4 +43,6 @@ async def arg_usage(_, query):
        await edit_message(message, CLONE_HELP_DICT[data[2]], button)


bot.add_handler(CallbackQueryHandler(arg_usage, filters=regex("^help")))
@new_task
async def bot_help(_, message):
    await send_message(message, help_string)
@@ -1,10 +1,9 @@
from aiofiles.os import path as aiopath
from base64 import b64encode
from pyrogram.filters import command
from pyrogram.handlers import MessageHandler
from re import match as re_match

from bot import bot, DOWNLOAD_DIR, LOGGER, bot_loop, task_dict_lock
from .. import LOGGER, bot_loop, task_dict_lock
from ..core.config_manager import Config
from ..helper.ext_utils.bot_utils import (
    get_content_type,
    sync_to_async,
@@ -40,8 +39,6 @@ from ..helper.mirror_leech_utils.download_utils.rclone_download import (
from ..helper.mirror_leech_utils.download_utils.telegram_download import (
    TelegramDownloadHelper,
)
from ..helper.telegram_helper.bot_commands import BotCommands
from ..helper.telegram_helper.filters import CustomFilters
from ..helper.telegram_helper.message_utils import send_message, get_tg_link_message


@@ -209,7 +206,7 @@ class Mirror(TaskListener):

        await self.get_tag(text)

        path = f"{DOWNLOAD_DIR}{self.mid}{self.folder_name}"
        path = f"{Config.DOWNLOAD_DIR}{self.mid}{self.folder_name}"

        if not self.link and (reply_to := self.message.reply_to_message):
            if reply_to.text:
@@ -392,61 +389,3 @@ async def nzb_leech(client, message):
    bot_loop.create_task(
        Mirror(client, message, is_leech=True, is_nzb=True).new_event()
    )


bot.add_handler(
    MessageHandler(
        mirror,
        filters=command(BotCommands.MirrorCommand, case_sensitive=True)
        & CustomFilters.authorized,
    )
)
bot.add_handler(
    MessageHandler(
        qb_mirror,
        filters=command(BotCommands.QbMirrorCommand, case_sensitive=True)
        & CustomFilters.authorized,
    )
)
bot.add_handler(
    MessageHandler(
        jd_mirror,
        filters=command(BotCommands.JdMirrorCommand, case_sensitive=True)
        & CustomFilters.authorized,
    )
)
bot.add_handler(
    MessageHandler(
        nzb_mirror,
        filters=command(BotCommands.NzbMirrorCommand, case_sensitive=True)
        & CustomFilters.authorized,
    )
)
bot.add_handler(
    MessageHandler(
        leech,
        filters=command(BotCommands.LeechCommand, case_sensitive=True)
        & CustomFilters.authorized,
    )
)
bot.add_handler(
    MessageHandler(
        qb_leech,
        filters=command(BotCommands.QbLeechCommand, case_sensitive=True)
        & CustomFilters.authorized,
    )
)
bot.add_handler(
    MessageHandler(
        jd_leech,
        filters=command(BotCommands.JdLeechCommand, case_sensitive=True)
        & CustomFilters.authorized,
    )
)
bot.add_handler(
    MessageHandler(
        nzb_leech,
        filters=command(BotCommands.NzbLeechCommand, case_sensitive=True)
        & CustomFilters.authorized,
    )
)
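The removed `bot.add_handler(...)` blocks show the pattern of this refactor: modules now only define their handler functions, and registration happens from one central place. A minimal, framework-free sketch of that idea (the registry and `register_all` names here are illustrative, not the bot's actual API):

```python
# Each module only defines its handler; no bot.add_handler at module bottom.
def mirror(message):
    return f"mirroring {message}"

def leech(message):
    return f"leeching {message}"

# Hypothetical central registry replacing the scattered bot.add_handler calls.
COMMAND_HANDLERS = {
    "mirror": mirror,
    "leech": leech,
}

def register_all(add_handler):
    """Wire every command to its handler in one place."""
    for cmd, func in COMMAND_HANDLERS.items():
        add_handler(cmd, func)

# Stand-in for the real client: collect registrations into a dict.
registered = {}
register_all(lambda cmd, func: registered.update({cmd: func}))
```

Centralizing registration means adding or renaming a command touches one table instead of a handler block at the bottom of every module.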
bot/modules/restart.py (new file, 96 lines)
@@ -0,0 +1,96 @@
from sys import executable
from aiofiles import open as aiopen
from aiofiles.os import path as aiopath, remove
from asyncio import gather, create_subprocess_exec
from os import execl as osexecl

from .. import intervals, scheduler, sabnzbd_client, LOGGER
from ..helper.ext_utils.bot_utils import new_task, sync_to_async
from ..helper.telegram_helper.message_utils import send_message
from ..helper.ext_utils.db_handler import database
from ..helper.ext_utils.files_utils import clean_all
from ..core.mltb_client import TgClient
from ..core.config_manager import Config


@new_task
async def restart_bot(_, message):
    intervals["stopAll"] = True
    restart_message = await send_message(message, "Restarting...")
    if scheduler.running:
        scheduler.shutdown(wait=False)
    if qb := intervals["qb"]:
        qb.cancel()
    if jd := intervals["jd"]:
        jd.cancel()
    if nzb := intervals["nzb"]:
        nzb.cancel()
    if st := intervals["status"]:
        for intvl in list(st.values()):
            intvl.cancel()
    await sync_to_async(clean_all)
    if sabnzbd_client.LOGGED_IN:
        await gather(
            sabnzbd_client.pause_all(),
            sabnzbd_client.purge_all(True),
            sabnzbd_client.delete_history("all", delete_files=True),
        )
    proc1 = await create_subprocess_exec(
        "pkill",
        "-9",
        "-f",
        "gunicorn|aria2c|qbittorrent-nox|ffmpeg|rclone|java|sabnzbdplus",
    )
    proc2 = await create_subprocess_exec("python3", "update.py")
    await gather(proc1.wait(), proc2.wait())
    async with aiopen(".restartmsg", "w") as f:
        await f.write(f"{restart_message.chat.id}\n{restart_message.id}\n")
    osexecl(executable, executable, "-m", "bot")


async def restart_notification():
    if await aiopath.isfile(".restartmsg"):
        with open(".restartmsg") as f:
            chat_id, msg_id = map(int, f)
    else:
        chat_id, msg_id = 0, 0

    async def send_incomplete_task_message(cid, msg):
        try:
            if msg.startswith("Restarted Successfully!"):
                await TgClient.bot.edit_message_text(
                    chat_id=chat_id, message_id=msg_id, text=msg
                )
                await remove(".restartmsg")
            else:
                await TgClient.bot.send_message(
                    chat_id=cid,
                    text=msg,
                    disable_web_page_preview=True,
                    disable_notification=True,
                )
        except Exception as e:
            LOGGER.error(e)

    if Config.INCOMPLETE_TASK_NOTIFIER and Config.DATABASE_URL:
        if notifier_dict := await database.get_incomplete_tasks():
            for cid, data in notifier_dict.items():
                msg = "Restarted Successfully!" if cid == chat_id else "Bot Restarted!"
                for tag, links in data.items():
                    msg += f"\n\n{tag}: "
                    for index, link in enumerate(links, start=1):
                        msg += f" <a href='{link}'>{index}</a> |"
                        if len(msg.encode()) > 4000:
                            await send_incomplete_task_message(cid, msg)
                            msg = ""
                if msg:
                    await send_incomplete_task_message(cid, msg)

    if await aiopath.isfile(".restartmsg"):
        try:
            await TgClient.bot.edit_message_text(
                chat_id=chat_id, message_id=msg_id, text="Restarted Successfully!"
            )
        except:
            pass
        await remove(".restartmsg")
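The new restart module survives `execl` by persisting the status message's chat and message ids to a `.restartmsg` marker file before re-executing, then reading it back in `restart_notification` after startup. The round trip can be sketched with plain stdlib file I/O (sync I/O stands in for `aiofiles` here to keep the sketch self-contained):

```python
import os
import tempfile

def write_restart_marker(path, chat_id, msg_id):
    # Same format restart_bot writes: one id per line.
    with open(path, "w") as f:
        f.write(f"{chat_id}\n{msg_id}\n")

def read_restart_marker(path):
    # Same parsing restart_notification uses: map(int, f) over the lines,
    # with (0, 0) as the "no pending restart" sentinel.
    if os.path.isfile(path):
        with open(path) as f:
            chat_id, msg_id = map(int, f)
        return chat_id, msg_id
    return 0, 0

marker = os.path.join(tempfile.mkdtemp(), ".restartmsg")
write_restart_marker(marker, -100123, 42)
ids = read_restart_marker(marker)  # (-100123, 42)
```

`int()` tolerates the trailing newline on each line, which is why iterating the file object directly works without an explicit `strip()`.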
@@ -5,11 +5,12 @@ from datetime import datetime, timedelta
from feedparser import parse as feed_parse
from functools import partial
from io import BytesIO
from pyrogram.filters import command, regex, create
from pyrogram.handlers import MessageHandler, CallbackQueryHandler
from pyrogram.filters import create
from pyrogram.handlers import MessageHandler
from time import time

from bot import scheduler, rss_dict, LOGGER, config_dict, bot
from .. import scheduler, rss_dict, LOGGER
from ..core.config_manager import Config
from ..helper.ext_utils.bot_utils import new_task, arg_parser
from ..helper.ext_utils.db_handler import database
from ..helper.ext_utils.exceptions import RssShutdownException
@@ -185,8 +186,7 @@ async def rss_sub(_, message, pre_event):
    except Exception as e:
        await send_message(message, str(e))
    if msg:
        if config_dict["DATABASE_URL"] and rss_dict[user_id]:
            await database.rss_update(user_id)
        await database.rss_update(user_id)
        await send_message(message, msg)
        is_sudo = await CustomFilters.sudo("", message)
        if scheduler.state == 2:
@@ -243,22 +243,21 @@ async def rss_update(_, message, pre_event, state):
    elif is_sudo and not scheduler.running:
        add_job()
        scheduler.start()
    if is_sudo and config_dict["DATABASE_URL"] and user_id != message.from_user.id:
    if is_sudo and Config.DATABASE_URL and user_id != message.from_user.id:
        await database.rss_update(user_id)
    if not rss_dict[user_id]:
        async with rss_dict_lock:
            del rss_dict[user_id]
        if config_dict["DATABASE_URL"]:
            await database.rss_delete(user_id)
            if not rss_dict:
                await database.trunc_table("rss")
        await database.rss_delete(user_id)
        if not rss_dict:
            await database.trunc_table("rss")
    if updated:
        LOGGER.info(f"Rss link with Title(s): {updated} has been {state}d!")
        await send_message(
            message,
            f"Rss links with Title(s): <code>{updated}</code> has been {state}d!",
        )
        if config_dict["DATABASE_URL"] and rss_dict.get(user_id):
        if rss_dict.get(user_id):
            await database.rss_update(user_id)
    await update_rss_menu(pre_event)

@@ -420,7 +419,7 @@ async def rss_edit(_, message, pre_event):
            y = x.split(" or ")
            exf_lists.append(y)
        rss_dict[user_id][title]["exf"] = exf_lists
    if config_dict["DATABASE_URL"] and updated:
    if updated:
        await database.rss_update(user_id)
    await update_rss_menu(pre_event)

@@ -433,8 +432,7 @@ async def rss_delete(_, message, pre_event):
        user = int(user)
        async with rss_dict_lock:
            del rss_dict[user]
        if config_dict["DATABASE_URL"]:
            await database.rss_delete(user)
        await database.rss_delete(user)
    await update_rss_menu(pre_event)


@@ -564,23 +562,20 @@ Timeout: 60 sec. Argument -c for command and arguments
        if data[1].endswith("unsub"):
            async with rss_dict_lock:
                del rss_dict[int(data[2])]
            if config_dict["DATABASE_URL"]:
                await database.rss_delete(int(data[2]))
            await database.rss_delete(int(data[2]))
            await update_rss_menu(query)
        elif data[1].endswith("pause"):
            async with rss_dict_lock:
                for title in list(rss_dict[int(data[2])].keys()):
                    rss_dict[int(data[2])][title]["paused"] = True
            if config_dict["DATABASE_URL"]:
                await database.rss_update(int(data[2]))
            await database.rss_update(int(data[2]))
        elif data[1].endswith("resume"):
            async with rss_dict_lock:
                for title in list(rss_dict[int(data[2])].keys()):
                    rss_dict[int(data[2])][title]["paused"] = False
            if scheduler.state == 2:
                scheduler.resume()
            if config_dict["DATABASE_URL"]:
                await database.rss_update(int(data[2]))
            await database.rss_update(int(data[2]))
            await update_rss_menu(query)
        elif data[1].startswith("all"):
            if len(rss_dict) == 0:
@@ -590,8 +585,7 @@ Timeout: 60 sec. Argument -c for command and arguments
        if data[1].endswith("unsub"):
            async with rss_dict_lock:
                rss_dict.clear()
            if config_dict["DATABASE_URL"]:
                await database.trunc_table("rss")
            await database.trunc_table("rss")
            await update_rss_menu(query)
        elif data[1].endswith("pause"):
            async with rss_dict_lock:
@@ -600,8 +594,7 @@ Timeout: 60 sec. Argument -c for command and arguments
                rss_dict[int(data[2])][title]["paused"] = True
            if scheduler.running:
                scheduler.pause()
            if config_dict["DATABASE_URL"]:
                await database.rss_update_all()
            await database.rss_update_all()
        elif data[1].endswith("resume"):
            async with rss_dict_lock:
                for user in list(rss_dict.keys()):
@@ -612,8 +605,7 @@ Timeout: 60 sec. Argument -c for command and arguments
        elif not scheduler.running:
            add_job()
            scheduler.start()
        if config_dict["DATABASE_URL"]:
            await database.rss_update_all()
        await database.rss_update_all()
    elif data[1] == "deluser":
        if len(rss_dict) == 0:
            await query.answer(text="No subscriptions!", show_alert=True)
@@ -653,7 +645,7 @@ Timeout: 60 sec. Argument -c for command and arguments


async def rss_monitor():
    chat = config_dict["RSS_CHAT"]
    chat = Config.RSS_CHAT
    if not chat:
        LOGGER.warning("RSS_CHAT not added! Shutting down rss scheduler...")
        scheduler.shutdown(wait=False)
@@ -788,7 +780,7 @@ async def rss_monitor():
def add_job():
    scheduler.add_job(
        rss_monitor,
        trigger=IntervalTrigger(seconds=config_dict["RSS_DELAY"]),
        trigger=IntervalTrigger(seconds=Config.RSS_DELAY),
        id="0",
        name="RSS",
        misfire_grace_time=15,
@@ -801,11 +793,4 @@ def add_job():
    add_job()
    scheduler.start()

bot.add_handler(
    MessageHandler(
        get_rss_menu,
        filters=command(BotCommands.RssCommand, case_sensitive=True)
        & CustomFilters.authorized,
    )
)
bot.add_handler(CallbackQueryHandler(rss_listener, filters=regex("^rss")))
@@ -1,16 +1,13 @@
from httpx import AsyncClient
from html import escape
from pyrogram.filters import command, regex
from pyrogram.handlers import MessageHandler, CallbackQueryHandler
from urllib.parse import quote

from bot import bot, LOGGER, config_dict, qbittorrent_client
from .. import LOGGER, qbittorrent_client
from ..core.config_manager import Config
from ..helper.ext_utils.bot_utils import sync_to_async, new_task
from ..helper.ext_utils.status_utils import get_readable_file_size
from ..helper.ext_utils.telegraph_helper import telegraph
from ..helper.telegram_helper.bot_commands import BotCommands
from ..helper.telegram_helper.button_build import ButtonMaker
from ..helper.telegram_helper.filters import CustomFilters
from ..helper.telegram_helper.message_utils import edit_message, send_message

PLUGINS = []
@@ -20,12 +17,14 @@ TELEGRAPH_LIMIT = 300

async def initiate_search_tools():
    qb_plugins = await sync_to_async(qbittorrent_client.search_plugins)
    if SEARCH_PLUGINS := config_dict["SEARCH_PLUGINS"]:
    if Config.SEARCH_PLUGINS:
        globals()["PLUGINS"] = []
        if qb_plugins:
            names = [plugin["name"] for plugin in qb_plugins]
            await sync_to_async(qbittorrent_client.search_uninstall_plugin, names=names)
        await sync_to_async(qbittorrent_client.search_install_plugin, SEARCH_PLUGINS)
        await sync_to_async(
            qbittorrent_client.search_install_plugin, Config.SEARCH_PLUGINS
        )
    elif qb_plugins:
        for plugin in qb_plugins:
            await sync_to_async(
@@ -33,11 +32,11 @@ async def initiate_search_tools():
            )
        globals()["PLUGINS"] = []

    if SEARCH_API_LINK := config_dict["SEARCH_API_LINK"]:
    if Config.SEARCH_API_LINK:
        global SITES
        try:
            async with AsyncClient() as client:
                response = await client.get(f"{SEARCH_API_LINK}/api/v1/sites")
                response = await client.get(f"{Config.SEARCH_API_LINK}/api/v1/sites")
            data = response.json()
            SITES = {
                str(site): str(site).capitalize() for site in data["supported_sites"]
@@ -52,28 +51,24 @@ async def initiate_search_tools():

async def search(key, site, message, method):
    if method.startswith("api"):
        SEARCH_API_LINK = config_dict["SEARCH_API_LINK"]
        SEARCH_LIMIT = config_dict["SEARCH_LIMIT"]
        if method == "apisearch":
            LOGGER.info(f"API Searching: {key} from {site}")
            if site == "all":
                api = f"{SEARCH_API_LINK}/api/v1/all/search?query={key}&limit={SEARCH_LIMIT}"
                api = f"{Config.SEARCH_API_LINK}/api/v1/all/search?query={key}&limit={Config.SEARCH_LIMIT}"
            else:
                api = f"{SEARCH_API_LINK}/api/v1/search?site={site}&query={key}&limit={SEARCH_LIMIT}"
                api = f"{Config.SEARCH_API_LINK}/api/v1/search?site={site}&query={key}&limit={Config.SEARCH_LIMIT}"
        elif method == "apitrend":
            LOGGER.info(f"API Trending from {site}")
            if site == "all":
                api = f"{SEARCH_API_LINK}/api/v1/all/trending?limit={SEARCH_LIMIT}"
                api = f"{Config.SEARCH_API_LINK}/api/v1/all/trending?limit={Config.SEARCH_LIMIT}"
            else:
                api = f"{SEARCH_API_LINK}/api/v1/trending?site={site}&limit={SEARCH_LIMIT}"
                api = f"{Config.SEARCH_API_LINK}/api/v1/trending?site={site}&limit={Config.SEARCH_LIMIT}"
        elif method == "apirecent":
            LOGGER.info(f"API Recent from {site}")
            if site == "all":
                api = f"{SEARCH_API_LINK}/api/v1/all/recent?limit={SEARCH_LIMIT}"
                api = f"{Config.SEARCH_API_LINK}/api/v1/all/recent?limit={Config.SEARCH_LIMIT}"
            else:
                api = (
                    f"{SEARCH_API_LINK}/api/v1/recent?site={site}&limit={SEARCH_LIMIT}"
                )
                api = f"{Config.SEARCH_API_LINK}/api/v1/recent?site={site}&limit={Config.SEARCH_LIMIT}"
        try:
            async with AsyncClient() as client:
                response = await client.get(api)
@@ -240,8 +235,7 @@ async def torrent_search(_, message):
    user_id = message.from_user.id
    buttons = ButtonMaker()
    key = message.text.split()
    SEARCH_PLUGINS = config_dict["SEARCH_PLUGINS"]
    if SITES is None and not SEARCH_PLUGINS:
    if SITES is None and not Config.SEARCH_PLUGINS:
        await send_message(
            message, "No API link or search PLUGINS added for this function"
        )
@@ -253,7 +247,7 @@ async def torrent_search(_, message):
        buttons.data_button("Cancel", f"torser {user_id} cancel")
        button = buttons.build_menu(2)
        await send_message(message, "Send a search key along with command", button)
    elif SITES is not None and SEARCH_PLUGINS:
    elif SITES is not None and Config.SEARCH_PLUGINS:
        buttons.data_button("Api", f"torser {user_id} apisearch")
        buttons.data_button("Plugins", f"torser {user_id} plugin")
        buttons.data_button("Cancel", f"torser {user_id} cancel")
@@ -312,13 +306,3 @@ async def torrent_search_update(_, query):
    else:
        await query.answer()
        await edit_message(message, "Search has been canceled!")


bot.add_handler(
    MessageHandler(
        torrent_search,
        filters=command(BotCommands.SearchCommand, case_sensitive=True)
        & CustomFilters.authorized,
    )
)
bot.add_handler(CallbackQueryHandler(torrent_search_update, filters=regex("^torser")))
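A change repeated across every module in this commit is replacing `config_dict["KEY"]` dictionary lookups with attributes on a `Config` class, matching the commit message's move from `config.env` to `config.py`. A simplified sketch of the pattern and why it helps (the attribute names below mirror ones seen in the diff, but the class body is illustrative, not the real `config_manager`):

```python
class Config:
    # Class-level attributes serve as defaults; config.py or the database
    # can overwrite them at startup.
    DOWNLOAD_DIR = "/usr/src/app/downloads/"
    SEARCH_API_LINK = ""
    SEARCH_LIMIT = 0

# Old style: a plain dict, where a mistyped key only fails when executed
config_dict = {"SEARCH_LIMIT": 0}
limit_old = config_dict["SEARCH_LIMIT"]

# New style: attribute access, which linters and IDEs can check statically,
# and which raises AttributeError naming the bad attribute
limit_new = Config.SEARCH_LIMIT

# Overriding a value from the generated config is a plain setattr:
setattr(Config, "SEARCH_LIMIT", 100)
```

Class attributes also keep every default in one file, so values no longer need to be quoted strings parsed out of a `.env` file.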
bot/modules/services.py (new file, 42 lines)
@@ -0,0 +1,42 @@
from time import time

from ..helper.ext_utils.bot_utils import new_task
from ..helper.telegram_helper.button_build import ButtonMaker
from ..helper.telegram_helper.message_utils import send_message, edit_message, send_file
from ..helper.telegram_helper.filters import CustomFilters
from ..helper.telegram_helper.bot_commands import BotCommands


@new_task
async def start(_, message):
    buttons = ButtonMaker()
    buttons.url_button(
        "Repo", "https://www.github.com/anasty17/mirror-leech-telegram-bot"
    )
    buttons.url_button("Code Owner", "https://t.me/anas_tayyar")
    reply_markup = buttons.build_menu(2)
    if await CustomFilters.authorized(_, message):
        start_string = f"""
This bot can mirror from links|tgfiles|torrents|nzb|rclone-cloud to any rclone cloud, Google Drive or to telegram.
Type /{BotCommands.HelpCommand} to get a list of available commands
"""
        await send_message(message, start_string, reply_markup)
    else:
        await send_message(
            message,
            "This bot can mirror from links|tgfiles|torrents|nzb|rclone-cloud to any rclone cloud, Google Drive or to telegram.\n\n⚠️ You Are not authorized user! Deploy your own mirror-leech bot",
            reply_markup,
        )


@new_task
async def ping(_, message):
    start_time = int(round(time() * 1000))
    reply = await send_message(message, "Starting Ping")
    end_time = int(round(time() * 1000))
    await edit_message(reply, f"{end_time - start_time} ms")


@new_task
async def log(_, message):
    await send_file(message, "log.txt")
@@ -1,16 +1,12 @@
from io import BytesIO
from pyrogram.filters import command
from pyrogram.handlers import MessageHandler, EditedMessageHandler

from bot import LOGGER, bot
from .. import LOGGER
from ..helper.ext_utils.bot_utils import cmd_exec, new_task
from ..helper.telegram_helper.bot_commands import BotCommands
from ..helper.telegram_helper.filters import CustomFilters
from ..helper.telegram_helper.message_utils import send_message, send_file


@new_task
async def shell(_, message):
async def run_shell(_, message):
    cmd = message.text.split(maxsplit=1)
    if len(cmd) == 1:
        await send_message(message, "No command to execute was given.")
@@ -32,19 +28,3 @@ async def shell(_, message):
        await send_message(message, reply)
    else:
        await send_message(message, "No Reply")


bot.add_handler(
    MessageHandler(
        shell,
        filters=command(BotCommands.ShellCommand, case_sensitive=True)
        & CustomFilters.owner,
    )
)
bot.add_handler(
    EditedMessageHandler(
        shell,
        filters=command(BotCommands.ShellCommand, case_sensitive=True)
        & CustomFilters.owner,
    )
)
bot/modules/stats.py (new file, 97 lines)
@@ -0,0 +1,97 @@
from time import time
from re import search as research
from asyncio import gather
from aiofiles.os import path as aiopath
from psutil import (
    disk_usage,
    cpu_percent,
    swap_memory,
    cpu_count,
    virtual_memory,
    net_io_counters,
    boot_time,
)

from .. import bot_start_time
from ..helper.ext_utils.status_utils import get_readable_file_size, get_readable_time
from ..helper.ext_utils.bot_utils import cmd_exec, new_task
from ..helper.telegram_helper.message_utils import send_message

commands = {
    "aria2": (["aria2c", "--version"], r"aria2 version ([\d.]+)"),
    "qBittorrent": (["qbittorrent-nox", "--version"], r"qBittorrent v([\d.]+)"),
    "SABnzbd+": (["sabnzbdplus", "--version"], r"sabnzbdplus-([\d.]+)"),
    "python": (["python3", "--version"], r"Python ([\d.]+)"),
    "rclone": (["rclone", "--version"], r"rclone v([\d.]+)"),
    "yt-dlp": (["yt-dlp", "--version"], r"([\d.]+)"),
    "ffmpeg": (["ffmpeg", "-version"], r"ffmpeg version ([\d.]+(-\w+)?).*"),
    "7z": (["7z", "i"], r"7-Zip ([\d.]+)"),
}


@new_task
async def bot_stats(_, message):
    total, used, free, disk = disk_usage("/")
    swap = swap_memory()
    memory = virtual_memory()
    stats = f"""
<b>Commit Date:</b> {commands["commit"]}

<b>Bot Uptime:</b> {get_readable_time(time() - bot_start_time)}
<b>OS Uptime:</b> {get_readable_time(time() - boot_time())}

<b>Total Disk Space:</b> {get_readable_file_size(total)}
<b>Used:</b> {get_readable_file_size(used)} | <b>Free:</b> {get_readable_file_size(free)}

<b>Upload:</b> {get_readable_file_size(net_io_counters().bytes_sent)}
<b>Download:</b> {get_readable_file_size(net_io_counters().bytes_recv)}

<b>CPU:</b> {cpu_percent(interval=0.5)}%
<b>RAM:</b> {memory.percent}%
<b>DISK:</b> {disk}%

<b>Physical Cores:</b> {cpu_count(logical=False)}
<b>Total Cores:</b> {cpu_count(logical=True)}
<b>SWAP:</b> {get_readable_file_size(swap.total)} | <b>Used:</b> {swap.percent}%

<b>Memory Total:</b> {get_readable_file_size(memory.total)}
<b>Memory Free:</b> {get_readable_file_size(memory.available)}
<b>Memory Used:</b> {get_readable_file_size(memory.used)}

<b>python:</b> {commands["python"]}
<b>aria2:</b> {commands["aria2"]}
<b>qBittorrent:</b> {commands["qBittorrent"]}
<b>SABnzbd+:</b> {commands["SABnzbd+"]}
<b>rclone:</b> {commands["rclone"]}
<b>yt-dlp:</b> {commands["yt-dlp"]}
<b>ffmpeg:</b> {commands["ffmpeg"]}
<b>7z:</b> {commands["7z"]}
"""
    await send_message(message, stats)


async def get_version_async(command, regex):
    try:
        out, err, code = await cmd_exec(command)
        if code != 0:
            return f"Error: {err}"
        match = research(regex, out)
        return match.group(1) if match else "Version not found"
    except Exception as e:
        return f"Exception: {str(e)}"


@new_task
async def get_packages_version():
    tasks = [get_version_async(command, regex) for command, regex in commands.values()]
    versions = await gather(*tasks)
    for tool, version in zip(commands.keys(), versions):
        commands[tool] = version
    if await aiopath.exists(".git"):
        last_commit = await cmd_exec(
            "git log -1 --date=short --pretty=format:'%cd <b>From</b> %cr'", True
        )
        last_commit = last_commit[0]
    else:
        last_commit = "No UPSTREAM_REPO"
    commands["commit"] = last_commit
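The new stats module's `get_version_async` runs each external tool and extracts its version string with a per-tool regex. The same extraction can be demonstrated synchronously against the local Python interpreter, using `sys.executable` in place of the hard-coded `python3` so the sketch is self-contained:

```python
import re
import subprocess
import sys

def get_version(command, regex):
    # Synchronous analogue of get_version_async: run the tool,
    # then pull the version number out of its output with a regex.
    result = subprocess.run(command, capture_output=True, text=True)
    if result.returncode != 0:
        return f"Error: {result.stderr}"
    # Some tools print version info on stderr; fall back to it.
    out = result.stdout or result.stderr
    match = re.search(regex, out)
    return match.group(1) if match else "Version not found"

version = get_version([sys.executable, "--version"], r"Python ([\d.]+)")
```

Keeping the command and regex together per tool, as the module's `commands` dict does, lets all versions be gathered concurrently and formatted uniformly.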
@@ -1,17 +1,14 @@
from psutil import cpu_percent, virtual_memory, disk_usage
from pyrogram.filters import command, regex
from pyrogram.handlers import MessageHandler, CallbackQueryHandler
from time import time

from bot import (
from .. import (
    task_dict_lock,
    status_dict,
    task_dict,
    bot_start_time,
    DOWNLOAD_DIR,
    intervals,
    bot,
)
from ..core.config_manager import Config
from ..helper.ext_utils.bot_utils import sync_to_async, new_task
from ..helper.ext_utils.status_utils import (
    MirrorStatus,
@@ -20,7 +17,6 @@ from ..helper.ext_utils.status_utils import (
    speed_string_to_bytes,
)
from ..helper.telegram_helper.bot_commands import BotCommands
from ..helper.telegram_helper.filters import CustomFilters
from ..helper.telegram_helper.message_utils import (
    send_message,
    delete_message,
@@ -33,12 +29,12 @@ from ..helper.telegram_helper.button_build import ButtonMaker


@new_task
async def mirror_status(_, message):
async def task_status(_, message):
    async with task_dict_lock:
        count = len(task_dict)
    if count == 0:
        currentTime = get_readable_time(time() - bot_start_time)
        free = get_readable_file_size(disk_usage(DOWNLOAD_DIR).free)
        free = get_readable_file_size(disk_usage(Config.DOWNLOAD_DIR).free)
        msg = f"No Active Tasks!\nEach user can get status for his tasks by adding me or user_id after cmd: /{BotCommands.StatusCommand} me"
        msg += (
            f"\n<b>CPU:</b> {cpu_percent()}% | <b>FREE:</b> {free}"
@@ -154,13 +150,3 @@ async def status_pages(_, query):
    button = ButtonMaker()
    button.data_button("Back", f"status {data[1]} ref")
    await edit_message(message, msg, button.build_menu())


bot.add_handler(
    MessageHandler(
        mirror_status,
        filters=command(BotCommands.StatusCommand, case_sensitive=True)
        & CustomFilters.authorized,
    )
)
bot.add_handler(CallbackQueryHandler(status_pages, filters=regex("^status")))
@@ -4,18 +4,13 @@ from functools import partial
from html import escape
from io import BytesIO
from os import getcwd
from pyrogram.filters import command, regex, create
from pyrogram.handlers import MessageHandler, CallbackQueryHandler
from pyrogram.filters import create
from pyrogram.handlers import MessageHandler
from time import time

from bot import (
    bot,
    IS_PREMIUM_USER,
    user_data,
    config_dict,
    MAX_SPLIT_SIZE,
    global_extension_filter,
)
from .. import user_data, extension_filter
from ..core.config_manager import Config
from ..core.mltb_client import TgClient
from ..helper.ext_utils.bot_utils import (
    update_user_ldata,
    new_task,
@@ -23,9 +18,7 @@ from ..helper.ext_utils.bot_utils import (
)
from ..helper.ext_utils.db_handler import database
from ..helper.ext_utils.media_utils import create_thumb
from ..helper.telegram_helper.bot_commands import BotCommands
from ..helper.telegram_helper.button_build import ButtonMaker
from ..helper.telegram_helper.filters import CustomFilters
from ..helper.telegram_helper.message_utils import (
    send_message,
    edit_message,
@@ -48,7 +41,7 @@ async def get_user_settings(from_user):
    if (
        user_dict.get("as_doc", False)
        or "as_doc" not in user_dict
        and config_dict["AS_DOCUMENT"]
        and Config.AS_DOCUMENT
    ):
        ltype = "DOCUMENT"
    else:
@@ -59,12 +52,12 @@ async def get_user_settings(from_user):
    if user_dict.get("split_size", False):
        split_size = user_dict["split_size"]
    else:
        split_size = config_dict["LEECH_SPLIT_SIZE"]
        split_size = Config.LEECH_SPLIT_SIZE

    if (
        user_dict.get("equal_splits", False)
        or "equal_splits" not in user_dict
        and config_dict["EQUAL_SPLITS"]
        and Config.EQUAL_SPLITS
    ):
        equal_splits = "Enabled"
    else:
@@ -73,7 +66,7 @@ async def get_user_settings(from_user):
    if (
        user_dict.get("media_group", False)
        or "media_group" not in user_dict
        and config_dict["MEDIA_GROUP"]
        and Config.MEDIA_GROUP
    ):
        media_group = "Enabled"
    else:
@@ -81,33 +74,33 @@ async def get_user_settings(from_user):

    if user_dict.get("lprefix", False):
        lprefix = user_dict["lprefix"]
    elif "lprefix" not in user_dict and config_dict["LEECH_FILENAME_PREFIX"]:
        lprefix = config_dict["LEECH_FILENAME_PREFIX"]
    elif "lprefix" not in user_dict and Config.LEECH_FILENAME_PREFIX:
        lprefix = Config.LEECH_FILENAME_PREFIX
    else:
        lprefix = "None"

    if user_dict.get("leech_dest", False):
        leech_dest = user_dict["leech_dest"]
    elif "leech_dest" not in user_dict and config_dict["LEECH_DUMP_CHAT"]:
        leech_dest = config_dict["LEECH_DUMP_CHAT"]
    elif "leech_dest" not in user_dict and Config.LEECH_DUMP_CHAT:
        leech_dest = Config.LEECH_DUMP_CHAT
    else:
        leech_dest = "None"

    if (
        IS_PREMIUM_USER
        TgClient.IS_PREMIUM_USER
        and user_dict.get("user_transmission", False)
        or "user_transmission" not in user_dict
        and config_dict["USER_TRANSMISSION"]
        and Config.USER_TRANSMISSION
    ):
        leech_method = "user"
    else:
        leech_method = "bot"

    if (
        IS_PREMIUM_USER
        TgClient.IS_PREMIUM_USER
        and user_dict.get("mixed_leech", False)
        or "mixed_leech" not in user_dict
        and config_dict["MIXED_LEECH"]
        and Config.MIXED_LEECH
    ):
        mixed_leech = "Enabled"
    else:
@@ -115,8 +108,8 @@ async def get_user_settings(from_user):

    if user_dict.get("thumb_layout", False):
        thumb_layout = user_dict["thumb_layout"]
    elif "thumb_layout" not in user_dict and config_dict["THUMBNAIL_LAYOUT"]:
        thumb_layout = config_dict["THUMBNAIL_LAYOUT"]
    elif "thumb_layout" not in user_dict and Config.THUMBNAIL_LAYOUT:
        thumb_layout = Config.THUMBNAIL_LAYOUT
    else:
        thumb_layout = "None"

@@ -126,7 +119,7 @@ async def get_user_settings(from_user):
    rccmsg = "Exists" if await aiopath.exists(rclone_conf) else "Not Exists"
    if user_dict.get("rclone_path", False):
        rccpath = user_dict["rclone_path"]
    elif RP := config_dict["RCLONE_PATH"]:
    elif RP := Config.RCLONE_PATH:
        rccpath = RP
    else:
        rccpath = "None"
@@ -135,7 +128,7 @@ async def get_user_settings(from_user):
    tokenmsg = "Exists" if await aiopath.exists(token_pickle) else "Not Exists"
|
||||
if user_dict.get("gdrive_id", False):
|
||||
gdrive_id = user_dict["gdrive_id"]
|
||||
elif GI := config_dict["GDRIVE_ID"]:
|
||||
elif GI := Config.GDRIVE_ID:
|
||||
gdrive_id = GI
|
||||
else:
|
||||
gdrive_id = "None"
|
||||
@ -143,7 +136,7 @@ async def get_user_settings(from_user):
|
||||
if (
|
||||
user_dict.get("stop_duplicate", False)
|
||||
or "stop_duplicate" not in user_dict
|
||||
and config_dict["STOP_DUPLICATE"]
|
||||
and Config.STOP_DUPLICATE
|
||||
):
|
||||
sd_msg = "Enabled"
|
||||
else:
|
||||
@ -152,9 +145,7 @@ async def get_user_settings(from_user):
|
||||
upload_paths = "Added" if user_dict.get("upload_paths", False) else "None"
|
||||
buttons.data_button("Upload Paths", f"userset {user_id} upload_paths")
|
||||
|
||||
default_upload = (
|
||||
user_dict.get("default_upload", "") or config_dict["DEFAULT_UPLOAD"]
|
||||
)
|
||||
default_upload = user_dict.get("default_upload", "") or Config.DEFAULT_UPLOAD
|
||||
du = "Gdrive API" if default_upload == "gd" else "Rclone"
|
||||
dur = "Gdrive API" if default_upload != "gd" else "Rclone"
|
||||
buttons.data_button(f"Upload using {dur}", f"userset {user_id} {default_upload}")
|
||||
@ -169,8 +160,8 @@ async def get_user_settings(from_user):
|
||||
buttons.data_button("Excluded Extensions", f"userset {user_id} ex_ex")
|
||||
if user_dict.get("excluded_extensions", False):
|
||||
ex_ex = user_dict["excluded_extensions"]
|
||||
elif "excluded_extensions" not in user_dict and global_extension_filter:
|
||||
ex_ex = global_extension_filter
|
||||
elif "excluded_extensions" not in user_dict and extension_filter:
|
||||
ex_ex = extension_filter
|
||||
else:
|
||||
ex_ex = "None"
|
||||
|
||||
@ -180,16 +171,16 @@ async def get_user_settings(from_user):
|
||||
buttons.data_button("YT-DLP Options", f"userset {user_id} yto")
|
||||
if user_dict.get("yt_opt", False):
|
||||
ytopt = user_dict["yt_opt"]
|
||||
elif "yt_opt" not in user_dict and config_dict["YT_DLP_OPTIONS"]:
|
||||
ytopt = config_dict["YT_DLP_OPTIONS"]
|
||||
elif "yt_opt" not in user_dict and Config.YT_DLP_OPTIONS:
|
||||
ytopt = Config.YT_DLP_OPTIONS
|
||||
else:
|
||||
ytopt = "None"
|
||||
|
||||
buttons.data_button("Ffmpeg Cmds", f"userset {user_id} ffc")
|
||||
if user_dict.get("ffmpeg_cmds", False):
|
||||
ffc = user_dict["ffmpeg_cmds"]
|
||||
elif "ffmpeg_cmds" not in user_dict and config_dict["FFMPEG_CMDS"]:
|
||||
ffc = config_dict["FFMPEG_CMDS"]
|
||||
elif "ffmpeg_cmds" not in user_dict and Config.FFMPEG_CMDS:
|
||||
ffc = Config.FFMPEG_CMDS
|
||||
else:
|
||||
ffc = "None"
|
||||
|
||||
@ -232,7 +223,7 @@ async def update_user_settings(query):
|
||||
|
||||
|
||||
@new_task
|
||||
async def user_settings(_, message):
|
||||
async def send_user_settings(_, message):
|
||||
from_user = message.from_user
|
||||
handler_dict[from_user.id] = False
|
||||
msg, button = await get_user_settings(from_user)
|
||||
@ -247,8 +238,7 @@ async def set_thumb(_, message, pre_event):
|
||||
update_user_ldata(user_id, "thumb", des_dir)
|
||||
await delete_message(message)
|
||||
await update_user_settings(pre_event)
|
||||
if config_dict["DATABASE_URL"]:
|
||||
await database.update_user_doc(user_id, "thumb", des_dir)
|
||||
await database.update_user_doc(user_id, "thumb", des_dir)
|
||||
|
||||
|
||||
@new_task
|
||||
@ -262,8 +252,7 @@ async def add_rclone(_, message, pre_event):
|
||||
update_user_ldata(user_id, "rclone_config", f"rclone/{user_id}.conf")
|
||||
await delete_message(message)
|
||||
await update_user_settings(pre_event)
|
||||
if config_dict["DATABASE_URL"]:
|
||||
await database.update_user_doc(user_id, "rclone_config", des_dir)
|
||||
await database.update_user_doc(user_id, "rclone_config", des_dir)
|
||||
|
||||
|
||||
@new_task
|
||||
@ -277,8 +266,7 @@ async def add_token_pickle(_, message, pre_event):
|
||||
update_user_ldata(user_id, "token_pickle", f"tokens/{user_id}.pickle")
|
||||
await delete_message(message)
|
||||
await update_user_settings(pre_event)
|
||||
if config_dict["DATABASE_URL"]:
|
||||
await database.update_user_doc(user_id, "token_pickle", des_dir)
|
||||
await database.update_user_doc(user_id, "token_pickle", des_dir)
|
||||
|
||||
|
||||
@new_task
|
||||
@ -294,8 +282,7 @@ async def delete_path(_, message, pre_event):
|
||||
update_user_ldata(user_id, "upload_paths", new_value)
|
||||
await delete_message(message)
|
||||
await update_user_settings(pre_event)
|
||||
if config_dict["DATABASE_URL"]:
|
||||
await database.update_user_doc(user_id, "upload_paths", new_value)
|
||||
await database.update_user_doc(user_id, "upload_paths", new_value)
|
||||
|
||||
|
||||
@new_task
|
||||
@ -306,7 +293,7 @@ async def set_option(_, message, pre_event, option):
|
||||
if option == "split_size":
|
||||
if not value.isdigit():
|
||||
value = get_size_bytes(value)
|
||||
value = min(int(value), MAX_SPLIT_SIZE)
|
||||
value = min(int(value), TgClient.MAX_SPLIT_SIZE)
|
||||
elif option == "excluded_extensions":
|
||||
fx = value.split()
|
||||
value = ["aria2", "!qB"]
|
||||
@ -341,8 +328,7 @@ async def set_option(_, message, pre_event, option):
|
||||
update_user_ldata(user_id, option, value)
|
||||
await delete_message(message)
|
||||
await update_user_settings(pre_event)
|
||||
if config_dict["DATABASE_URL"]:
|
||||
await database.update_user_data(user_id)
|
||||
await database.update_user_data(user_id)
|
||||
|
||||
|
||||
async def event_handler(client, query, pfunc, photo=False, document=False):
|
||||
@ -399,8 +385,7 @@ async def edit_user_settings(client, query):
|
||||
update_user_ldata(user_id, data[2], data[3] == "true")
|
||||
await query.answer()
|
||||
await update_user_settings(query)
|
||||
if config_dict["DATABASE_URL"]:
|
||||
await database.update_user_data(user_id)
|
||||
await database.update_user_data(user_id)
|
||||
elif data[2] in ["thumb", "rclone_config", "token_pickle"]:
|
||||
if data[2] == "thumb":
|
||||
fpath = thumb_path
|
||||
@ -413,8 +398,7 @@ async def edit_user_settings(client, query):
|
||||
await remove(fpath)
|
||||
update_user_ldata(user_id, data[2], "")
|
||||
await update_user_settings(query)
|
||||
if config_dict["DATABASE_URL"]:
|
||||
await database.update_user_doc(user_id, data[2])
|
||||
await database.update_user_doc(user_id, data[2])
|
||||
else:
|
||||
await query.answer("Old Settings", show_alert=True)
|
||||
await update_user_settings(query)
|
||||
@ -430,15 +414,13 @@ async def edit_user_settings(client, query):
|
||||
await query.answer()
|
||||
update_user_ldata(user_id, data[2], "")
|
||||
await update_user_settings(query)
|
||||
if config_dict["DATABASE_URL"]:
|
||||
await database.update_user_data(user_id)
|
||||
await database.update_user_data(user_id)
|
||||
elif data[2] in ["split_size", "leech_dest", "rclone_path", "gdrive_id"]:
|
||||
await query.answer()
|
||||
if data[2] in user_data.get(user_id, {}):
|
||||
del user_data[user_id][data[2]]
|
||||
await update_user_settings(query)
|
||||
if config_dict["DATABASE_URL"]:
|
||||
await database.update_user_data(user_id)
|
||||
await database.update_user_data(user_id)
|
||||
elif data[2] == "leech":
|
||||
await query.answer()
|
||||
thumbpath = f"Thumbnails/{user_id}.jpg"
|
||||
@ -449,25 +431,25 @@ async def edit_user_settings(client, query):
|
||||
if user_dict.get("split_size", False):
|
||||
split_size = user_dict["split_size"]
|
||||
else:
|
||||
split_size = config_dict["LEECH_SPLIT_SIZE"]
|
||||
split_size = Config.LEECH_SPLIT_SIZE
|
||||
buttons.data_button("Leech Destination", f"userset {user_id} ldest")
|
||||
if user_dict.get("leech_dest", False):
|
||||
leech_dest = user_dict["leech_dest"]
|
||||
elif "leech_dest" not in user_dict and config_dict["LEECH_DUMP_CHAT"]:
|
||||
leech_dest = config_dict["LEECH_DUMP_CHAT"]
|
||||
elif "leech_dest" not in user_dict and Config.LEECH_DUMP_CHAT:
|
||||
leech_dest = Config.LEECH_DUMP_CHAT
|
||||
else:
|
||||
leech_dest = "None"
|
||||
buttons.data_button("Leech Prefix", f"userset {user_id} leech_prefix")
|
||||
if user_dict.get("lprefix", False):
|
||||
lprefix = user_dict["lprefix"]
|
||||
elif "lprefix" not in user_dict and config_dict["LEECH_FILENAME_PREFIX"]:
|
||||
lprefix = config_dict["LEECH_FILENAME_PREFIX"]
|
||||
elif "lprefix" not in user_dict and Config.LEECH_FILENAME_PREFIX:
|
||||
lprefix = Config.LEECH_FILENAME_PREFIX
|
||||
else:
|
||||
lprefix = "None"
|
||||
if (
|
||||
user_dict.get("as_doc", False)
|
||||
or "as_doc" not in user_dict
|
||||
and config_dict["AS_DOCUMENT"]
|
||||
and Config.AS_DOCUMENT
|
||||
):
|
||||
ltype = "DOCUMENT"
|
||||
buttons.data_button("Send As Media", f"userset {user_id} as_doc false")
|
||||
@ -477,7 +459,7 @@ async def edit_user_settings(client, query):
|
||||
if (
|
||||
user_dict.get("equal_splits", False)
|
||||
or "equal_splits" not in user_dict
|
||||
and config_dict["EQUAL_SPLITS"]
|
||||
and Config.EQUAL_SPLITS
|
||||
):
|
||||
buttons.data_button(
|
||||
"Disable Equal Splits", f"userset {user_id} equal_splits false"
|
||||
@ -491,7 +473,7 @@ async def edit_user_settings(client, query):
|
||||
if (
|
||||
user_dict.get("media_group", False)
|
||||
or "media_group" not in user_dict
|
||||
and config_dict["MEDIA_GROUP"]
|
||||
and Config.MEDIA_GROUP
|
||||
):
|
||||
buttons.data_button(
|
||||
"Disable Media Group", f"userset {user_id} media_group false"
|
||||
@ -503,16 +485,16 @@ async def edit_user_settings(client, query):
|
||||
)
|
||||
media_group = "Disabled"
|
||||
if (
|
||||
IS_PREMIUM_USER
|
||||
TgClient.IS_PREMIUM_USER
|
||||
and user_dict.get("user_transmission", False)
|
||||
or "user_transmission" not in user_dict
|
||||
and config_dict["USER_TRANSMISSION"]
|
||||
and Config.USER_TRANSMISSION
|
||||
):
|
||||
buttons.data_button(
|
||||
"Leech by Bot", f"userset {user_id} user_transmission false"
|
||||
)
|
||||
leech_method = "user"
|
||||
elif IS_PREMIUM_USER:
|
||||
elif TgClient.IS_PREMIUM_USER:
|
||||
leech_method = "bot"
|
||||
buttons.data_button(
|
||||
"Leech by User", f"userset {user_id} user_transmission true"
|
||||
@ -521,16 +503,16 @@ async def edit_user_settings(client, query):
|
||||
leech_method = "bot"
|
||||
|
||||
if (
|
||||
IS_PREMIUM_USER
|
||||
TgClient.IS_PREMIUM_USER
|
||||
and user_dict.get("mixed_leech", False)
|
||||
or "mixed_leech" not in user_dict
|
||||
and config_dict["MIXED_LEECH"]
|
||||
and Config.MIXED_LEECH
|
||||
):
|
||||
mixed_leech = "Enabled"
|
||||
buttons.data_button(
|
||||
"Disable Mixed Leech", f"userset {user_id} mixed_leech false"
|
||||
)
|
||||
elif IS_PREMIUM_USER:
|
||||
elif TgClient.IS_PREMIUM_USER:
|
||||
mixed_leech = "Disabled"
|
||||
buttons.data_button(
|
||||
"Enable Mixed Leech", f"userset {user_id} mixed_leech true"
|
||||
@ -541,8 +523,8 @@ async def edit_user_settings(client, query):
|
||||
buttons.data_button("Thumbnail Layout", f"userset {user_id} tlayout")
|
||||
if user_dict.get("thumb_layout", False):
|
||||
thumb_layout = user_dict["thumb_layout"]
|
||||
elif "thumb_layout" not in user_dict and config_dict["THUMBNAIL_LAYOUT"]:
|
||||
thumb_layout = config_dict["THUMBNAIL_LAYOUT"]
|
||||
elif "thumb_layout" not in user_dict and Config.THUMBNAIL_LAYOUT:
|
||||
thumb_layout = Config.THUMBNAIL_LAYOUT
|
||||
else:
|
||||
thumb_layout = "None"
|
||||
|
||||
@ -571,7 +553,7 @@ Thumbnail Layout is <b>{thumb_layout}</b>
|
||||
rccmsg = "Exists" if await aiopath.exists(rclone_conf) else "Not Exists"
|
||||
if user_dict.get("rclone_path", False):
|
||||
rccpath = user_dict["rclone_path"]
|
||||
elif RP := config_dict["RCLONE_PATH"]:
|
||||
elif RP := Config.RCLONE_PATH:
|
||||
rccpath = RP
|
||||
else:
|
||||
rccpath = "None"
|
||||
@ -588,7 +570,7 @@ Rclone Path is <code>{rccpath}</code>"""
|
||||
if (
|
||||
user_dict.get("stop_duplicate", False)
|
||||
or "stop_duplicate" not in user_dict
|
||||
and config_dict["STOP_DUPLICATE"]
|
||||
and Config.STOP_DUPLICATE
|
||||
):
|
||||
buttons.data_button(
|
||||
"Disable Stop Duplicate", f"userset {user_id} stop_duplicate false"
|
||||
@ -604,7 +586,7 @@ Rclone Path is <code>{rccpath}</code>"""
|
||||
tokenmsg = "Exists" if await aiopath.exists(token_pickle) else "Not Exists"
|
||||
if user_dict.get("gdrive_id", False):
|
||||
gdrive_id = user_dict["gdrive_id"]
|
||||
elif GDID := config_dict["GDRIVE_ID"]:
|
||||
elif GDID := Config.GDRIVE_ID:
|
||||
gdrive_id = GDID
|
||||
else:
|
||||
gdrive_id = "None"
|
||||
@ -637,7 +619,7 @@ Stop Duplicate is <b>{sd_msg}</b>"""
|
||||
elif data[2] == "yto":
|
||||
await query.answer()
|
||||
buttons = ButtonMaker()
|
||||
if user_dict.get("yt_opt", False) or config_dict["YT_DLP_OPTIONS"]:
|
||||
if user_dict.get("yt_opt", False) or Config.YT_DLP_OPTIONS:
|
||||
buttons.data_button(
|
||||
"Remove YT-DLP Options", f"userset {user_id} yt_opt", "header"
|
||||
)
|
||||
@ -655,7 +637,7 @@ Check all yt-dlp api options from this <a href='https://github.com/yt-dlp/yt-dlp
|
||||
elif data[2] == "ffc":
|
||||
await query.answer()
|
||||
buttons = ButtonMaker()
|
||||
if user_dict.get("ffmpeg_cmds", False) or config_dict["FFMPEG_CMDS"]:
|
||||
if user_dict.get("ffmpeg_cmds", False) or Config.FFMPEG_CMDS:
|
||||
buttons.data_button(
|
||||
"Remove FFMPEG Commands",
|
||||
f"userset {user_id} ffmpeg_cmds",
|
||||
@ -665,10 +647,9 @@ Check all yt-dlp api options from this <a href='https://github.com/yt-dlp/yt-dlp
|
||||
buttons.data_button("Close", f"userset {user_id} close")
|
||||
rmsg = """list of lists of ffmpeg commands. You can set multiple ffmpeg commands for all files before upload. Don't write ffmpeg at beginning, start directly with the arguments.
|
||||
Notes:
|
||||
1. Add <code>-del</code> to the list(s) which you want from the bot to delete the original files after command run complete!
|
||||
1. Add <code>-del</code> to the list which you want from the bot to delete the original files after command run complete!
|
||||
2. Seed will get disbaled while using this option
|
||||
3. It must be list of list(s) event of one list added like [["-i", "mltb.mkv", "-c", "copy", "-c:s", "srt", "mltb.mkv", "-del"]]
|
||||
Examples: [["-i", "mltb.mkv", "-c", "copy", "-c:s", "srt", "mltb.mkv", "-del"], ["-i", "mltb.video", "-c", "copy", "-c:s", "srt", "mltb"], ["-i", "mltb.m4a", "-c:a", "libmp3lame", "-q:a", "2", "mltb.mp3"], ["-i", "mltb.audio", "-c:a", "libmp3lame", "-q:a", "2", "mltb.mp3"]]
|
||||
Examples: ["-i mltb.mkv -c copy -c:s srt mltb.mkv", "-i mltb.video -c copy -c:s srt mltb", "-i mltb.m4a -c:a libmp3lame -q:a 2 mltb.mp3", "-i mltb.audio -c:a libmp3lame -q:a 2 mltb.mp3"]
|
||||
Here I will explain how to use mltb.* which is reference to files you want to work on.
|
||||
1. First cmd: the input is mltb.mkv so this cmd will work only on mkv videos and the output is mltb.mkv also so all outputs is mkv. -del will delete the original media after complete run of the cmd.
|
||||
2. Second cmd: the input is mltb.video so this cmd will work on all videos and the output is only mltb so the extenstion is same as input files.
|
||||
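The string-based command format described in the notes above can be sketched in a few lines. `parse_ffmpeg_cmd` is a hypothetical helper for illustration, not the bot's actual implementation; it only shows how one entry of the new `FFMPEG_CMDS` list splits into ffmpeg arguments and how the optional `-del` flag can be detected:

```python
import shlex

def parse_ffmpeg_cmd(cmd: str):
    """Split one FFMPEG_CMDS entry (a plain string, per the new format)
    into argv tokens, and extract the optional -del flag that asks the
    bot to delete the original file after the command finishes."""
    args = shlex.split(cmd)
    delete_original = "-del" in args
    if delete_original:
        args.remove("-del")
    return args, delete_original

# First example entry from the help text above:
args, delete_original = parse_ffmpeg_cmd("-i mltb.mkv -c copy -c:s srt mltb.mkv -del")
# args now holds the tokens to pass to ffmpeg; delete_original is True
```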
@@ -686,7 +667,7 @@ Here I will explain how to use mltb.* which is reference to files you want to wo
buttons.data_button("Close", f"userset {user_id} close")
await edit_message(
message,
-f"Send Leech split size in bytes. IS_PREMIUM_USER: {IS_PREMIUM_USER}. Timeout: 60 sec",
+f"Send Leech split size in bytes. IS_PREMIUM_USER: {TgClient.IS_PREMIUM_USER}. Timeout: 60 sec",
buttons.build_menu(1),
)
pfunc = partial(set_option, pre_event=query, option="split_size")
@@ -758,7 +739,7 @@ Here I will explain how to use mltb.* which is reference to files you want to wo
if (
user_dict.get("lprefix", False)
or "lprefix" not in user_dict
-and config_dict["LEECH_FILENAME_PREFIX"]
+and Config.LEECH_FILENAME_PREFIX
):
buttons.data_button("Remove Leech Prefix", f"userset {user_id} lprefix")
buttons.data_button("Back", f"userset {user_id} leech")
@@ -776,7 +757,7 @@ Here I will explain how to use mltb.* which is reference to files you want to wo
if (
user_dict.get("leech_dest", False)
or "leech_dest" not in user_dict
-and config_dict["LEECH_DUMP_CHAT"]
+and Config.LEECH_DUMP_CHAT
):
buttons.data_button(
"Reset Leech Destination", f"userset {user_id} leech_dest"
@@ -796,7 +777,7 @@ Here I will explain how to use mltb.* which is reference to files you want to wo
if (
user_dict.get("thumb_layout", False)
or "thumb_layout" not in user_dict
-and config_dict["THUMBNAIL_LAYOUT"]
+and Config.THUMBNAIL_LAYOUT
):
buttons.data_button(
"Reset Thumbnail Layout", f"userset {user_id} thumb_layout"
@@ -816,7 +797,7 @@ Here I will explain how to use mltb.* which is reference to files you want to wo
if (
user_dict.get("excluded_extensions", False)
or "excluded_extensions" not in user_dict
-and global_extension_filter
+and extension_filter
):
buttons.data_button(
"Remove Excluded Extensions", f"userset {user_id} excluded_extensions"
@@ -861,15 +842,13 @@ Example: script/code/s | mirror/leech | tea/ /s | clone | cpu/ | \[mltb\]/mltb |
du = "rc" if data[2] == "gd" else "gd"
update_user_ldata(user_id, "default_upload", du)
await update_user_settings(query)
-if config_dict["DATABASE_URL"]:
-await database.update_user_data(user_id)
+await database.update_user_data(user_id)
elif data[2] == "user_tokens":
await query.answer()
tr = data[3].lower() == "false"
update_user_ldata(user_id, "user_tokens", tr)
await update_user_settings(query)
-if config_dict["DATABASE_URL"]:
-await database.update_user_data(user_id)
+await database.update_user_data(user_id)
elif data[2] == "upload_paths":
await query.answer()
buttons = ButtonMaker()
@@ -933,8 +912,7 @@ Example: script/code/s | mirror/leech | tea/ /s | clone | cpu/ | \[mltb\]/mltb |
else:
user_data[user_id].clear()
await update_user_settings(query)
-if config_dict["DATABASE_URL"]:
-await database.update_user_data(user_id)
+await database.update_user_data(user_id)
for fpath in [thumb_path, rclone_conf, token_pickle]:
if await aiopath.exists(fpath):
await remove(fpath)
@@ -948,7 +926,7 @@ Example: script/code/s | mirror/leech | tea/ /s | clone | cpu/ | \[mltb\]/mltb |


@new_task
-async def send_users_settings(_, message):
+async def get_users_settings(_, message):
if user_data:
msg = ""
for u, d in user_data.items():
@@ -967,20 +945,3 @@ async def send_users_settings(_, message):
await send_message(message, msg)
else:
await send_message(message, "No users data!")
-
-
-bot.add_handler(
-MessageHandler(
-send_users_settings,
-filters=command(BotCommands.UsersCommand, case_sensitive=True)
-& CustomFilters.sudo,
-)
-)
-bot.add_handler(
-MessageHandler(
-user_settings,
-filters=command(BotCommands.UserSetCommand, case_sensitive=True)
-& CustomFilters.authorized,
-)
-)
-bot.add_handler(CallbackQueryHandler(edit_user_settings, filters=regex("^userset")))
@@ -1,12 +1,13 @@
from httpx import AsyncClient
from asyncio import wait_for, Event
from functools import partial
-from pyrogram.filters import command, regex, user
-from pyrogram.handlers import MessageHandler, CallbackQueryHandler
+from pyrogram.filters import regex, user
+from pyrogram.handlers import CallbackQueryHandler
from time import time
from yt_dlp import YoutubeDL

-from bot import DOWNLOAD_DIR, bot, config_dict, LOGGER, bot_loop, task_dict_lock
+from .. import LOGGER, bot_loop, task_dict_lock
+from ..core.config_manager import Config
from ..helper.ext_utils.bot_utils import (
new_task,
sync_to_async,
@@ -17,9 +18,7 @@ from ..helper.ext_utils.links_utils import is_url
from ..helper.ext_utils.status_utils import get_readable_file_size, get_readable_time
from ..helper.listeners.task_listener import TaskListener
from ..helper.mirror_leech_utils.download_utils.yt_dlp_download import YoutubeDLHelper
-from ..helper.telegram_helper.bot_commands import BotCommands
from ..helper.telegram_helper.button_build import ButtonMaker
-from ..helper.telegram_helper.filters import CustomFilters
from ..helper.telegram_helper.message_utils import (
send_message,
edit_message,
@@ -398,11 +397,11 @@ class YtDlp(TaskListener):
if len(self.bulk) != 0:
del self.bulk[0]

-path = f"{DOWNLOAD_DIR}{self.mid}{self.folder_name}"
+path = f"{Config.DOWNLOAD_DIR}{self.mid}{self.folder_name}"

await self.get_tag(text)

-opt = opt or self.user_dict.get("yt_opt") or config_dict["YT_DLP_OPTIONS"]
+opt = opt or self.user_dict.get("yt_opt") or Config.YT_DLP_OPTIONS

if not self.link and (reply_to := self.message.reply_to_message):
self.link = reply_to.text.split("\n", 1)[0].strip()
@@ -481,19 +480,3 @@ async def ytdl(client, message):

async def ytdl_leech(client, message):
bot_loop.create_task(YtDlp(client, message, is_leech=True).new_event())
-
-
-bot.add_handler(
-MessageHandler(
-ytdl,
-filters=command(BotCommands.YtdlCommand, case_sensitive=True)
-& CustomFilters.authorized,
-)
-)
-bot.add_handler(
-MessageHandler(
-ytdl_leech,
-filters=command(BotCommands.YtdlLeechCommand, case_sensitive=True)
-& CustomFilters.authorized,
-)
-)
@@ -1,87 +0,0 @@
-# Remove this line before deploying
-_____REMOVE_THIS_LINE_____=True
-# REQUIRED CONFIG
-BOT_TOKEN = "" # Require restart after changing it while bot running
-OWNER_ID = "" # Require restart after changing it while bot running
-TELEGRAM_API = "" # Require restart after changing it while bot running
-TELEGRAM_HASH = "" # Require restart after changing it while bot running
-# OPTIONAL CONFIG
-USER_SESSION_STRING = "" # Require restart after changing it while bot running
-DOWNLOAD_DIR = "/usr/src/app/downloads/" # Require restart after changing it while bot running
-CMD_SUFFIX = "" # Require restart after changing it while bot running
-AUTHORIZED_CHATS = "" # Require restart after changing it while bot running
-SUDO_USERS = "" # Require restart after changing it while bot running
-DATABASE_URL = ""
-STATUS_LIMIT = "10"
-DEFAULT_UPLOAD = "gd"
-STATUS_UPDATE_INTERVAL = "10"
-FILELION_API = ""
-STREAMWISH_API = ""
-EXTENSION_FILTER = ""
-INCOMPLETE_TASK_NOTIFIER = "False"
-YT_DLP_OPTIONS = ""
-USE_SERVICE_ACCOUNTS = "False"
-NAME_SUBSTITUTE = ""
-FFMPEG_CMDS = ""
-# GDrive Tools
-GDRIVE_ID = ""
-IS_TEAM_DRIVE = "False"
-STOP_DUPLICATE = "False"
-INDEX_URL = ""
-# Rclone
-RCLONE_PATH = ""
-RCLONE_FLAGS = ""
-RCLONE_SERVE_URL = ""
-RCLONE_SERVE_PORT = ""
-RCLONE_SERVE_USER = ""
-RCLONE_SERVE_PASS = ""
-# JDownloader
-JD_EMAIL = ""
-JD_PASS = ""
-# Sabnzbd
-USENET_SERVERS = "[{'name': 'main', 'host': '', 'port': 563, 'timeout': 60, 'username': '', 'password': '', 'connections': 8, 'ssl': 1, 'ssl_verify': 2, 'ssl_ciphers': '', 'enable': 1, 'required': 0, 'optional': 0, 'retention': 0, 'send_group': 0, 'priority': 0}]"
-# Update
-UPSTREAM_REPO = ""
-UPSTREAM_BRANCH = ""
-# Leech
-LEECH_SPLIT_SIZE = ""
-AS_DOCUMENT = "False"
-EQUAL_SPLITS = "False"
-MEDIA_GROUP = "False"
-USER_TRANSMISSION = "False"
-MIXED_LEECH = "False"
-LEECH_FILENAME_PREFIX = ""
-LEECH_DUMP_CHAT = ""
-THUMBNAIL_LAYOUT = ""
-# qBittorrent/Aria2c
-TORRENT_TIMEOUT = ""
-BASE_URL = ""
-BASE_URL_PORT = ""
-WEB_PINCODE = "False"
-# Queueing system
-QUEUE_ALL = ""
-QUEUE_DOWNLOAD = ""
-QUEUE_UPLOAD = ""
-# RSS
-RSS_DELAY = "900"
-RSS_CHAT = ""
-# Torrent Search
-SEARCH_API_LINK = ""
-SEARCH_LIMIT = "0"
-SEARCH_PLUGINS = '["https://raw.githubusercontent.com/qbittorrent/search-plugins/master/nova3/engines/piratebay.py",
-"https://raw.githubusercontent.com/qbittorrent/search-plugins/master/nova3/engines/limetorrents.py",
-"https://raw.githubusercontent.com/qbittorrent/search-plugins/master/nova3/engines/torlock.py",
-"https://raw.githubusercontent.com/qbittorrent/search-plugins/master/nova3/engines/torrentscsv.py",
-"https://raw.githubusercontent.com/qbittorrent/search-plugins/master/nova3/engines/eztv.py",
-"https://raw.githubusercontent.com/qbittorrent/search-plugins/master/nova3/engines/torrentproject.py",
-"https://raw.githubusercontent.com/MaurizioRicci/qBittorrent_search_engines/master/kickass_torrent.py",
-"https://raw.githubusercontent.com/MaurizioRicci/qBittorrent_search_engines/master/yts_am.py",
-"https://raw.githubusercontent.com/MadeOfMagicAndWires/qBit-plugins/master/engines/linuxtracker.py",
-"https://raw.githubusercontent.com/MadeOfMagicAndWires/qBit-plugins/master/engines/nyaasi.py",
-"https://raw.githubusercontent.com/LightDestory/qBittorrent-Search-Plugins/master/src/engines/ettv.py",
-"https://raw.githubusercontent.com/LightDestory/qBittorrent-Search-Plugins/master/src/engines/glotorrents.py",
-"https://raw.githubusercontent.com/LightDestory/qBittorrent-Search-Plugins/master/src/engines/thepiratebay.py",
-"https://raw.githubusercontent.com/v1k45/1337x-qBittorrent-search-plugin/master/leetx.py",
-"https://raw.githubusercontent.com/nindogo/qbtSearchScripts/master/magnetdl.py",
-"https://raw.githubusercontent.com/msagca/qbittorrent_plugins/main/uniondht.py",
-"https://raw.githubusercontent.com/khensolomon/leyts/master/yts.py"]'
106
config_sample.py
Normal file
@@ -0,0 +1,106 @@
+# REQUIRED CONFIG
+BOT_TOKEN = ""
+OWNER_ID = 0
+TELEGRAM_API = 0
+TELEGRAM_HASH = ""
+# OPTIONAL CONFIG
+USER_SESSION_STRING = ""
+DOWNLOAD_DIR = "/usr/src/app/downloads/"
+CMD_SUFFIX = ""
+AUTHORIZED_CHATS = ""
+SUDO_USERS = ""
+DATABASE_URL = ""
+STATUS_LIMIT = 10
+DEFAULT_UPLOAD = "rc"
+STATUS_UPDATE_INTERVAL = 15
+FILELION_API = ""
+STREAMWISH_API = ""
+EXTENSION_FILTER = ""
+INCOMPLETE_TASK_NOTIFIER = False
+YT_DLP_OPTIONS = ""
+USE_SERVICE_ACCOUNTS = False
+NAME_SUBSTITUTE = ""
+FFMPEG_CMDS = []
+# GDrive Tools
+GDRIVE_ID = ""
+IS_TEAM_DRIVE = False
+STOP_DUPLICATE = False
+INDEX_URL = ""
+# Rclone
+RCLONE_PATH = ""
+RCLONE_FLAGS = ""
+RCLONE_SERVE_URL = ""
+RCLONE_SERVE_PORT = 0
+RCLONE_SERVE_USER = ""
+RCLONE_SERVE_PASS = ""
+# JDownloader
+JD_EMAIL = ""
+JD_PASS = ""
+# Sabnzbd
+USENET_SERVERS = [
+{
+"name": "main",
+"host": "",
+"port": 563,
+"timeout": 60,
+"username": "",
+"password": "",
+"connections": 8,
+"ssl": 1,
+"ssl_verify": 2,
+"ssl_ciphers": "",
+"enable": 1,
+"required": 0,
+"optional": 0,
+"retention": 0,
+"send_group": 0,
+"priority": 0,
+}
+]
+# Update
+UPSTREAM_REPO = ""
+UPSTREAM_BRANCH = "master"
+# Leech
+LEECH_SPLIT_SIZE = 0
+AS_DOCUMENT = False
+EQUAL_SPLITS = False
+MEDIA_GROUP = False
+USER_TRANSMISSION = False
+MIXED_LEECH = False
+LEECH_FILENAME_PREFIX = ""
+LEECH_DUMP_CHAT = ""
+THUMBNAIL_LAYOUT = ""
+# qBittorrent/Aria2c
+TORRENT_TIMEOUT = 0
+BASE_URL = ""
+BASE_URL_PORT = 0
+WEB_PINCODE = False
+# Queueing system
+QUEUE_ALL = 0
+QUEUE_DOWNLOAD = 0
+QUEUE_UPLOAD = 0
+# RSS
+RSS_DELAY = 600
+RSS_CHAT = ""
+# Torrent Search
+SEARCH_API_LINK = ""
+SEARCH_LIMIT = 0
+SEARCH_PLUGINS = [
+"https://raw.githubusercontent.com/qbittorrent/search-plugins/master/nova3/engines/piratebay.py",
+"https://raw.githubusercontent.com/qbittorrent/search-plugins/master/nova3/engines/limetorrents.py",
+"https://raw.githubusercontent.com/qbittorrent/search-plugins/master/nova3/engines/torlock.py",
+"https://raw.githubusercontent.com/qbittorrent/search-plugins/master/nova3/engines/torrentscsv.py",
+"https://raw.githubusercontent.com/qbittorrent/search-plugins/master/nova3/engines/eztv.py",
+"https://raw.githubusercontent.com/qbittorrent/search-plugins/master/nova3/engines/torrentproject.py",
+"https://raw.githubusercontent.com/MaurizioRicci/qBittorrent_search_engines/master/kickass_torrent.py",
+"https://raw.githubusercontent.com/MaurizioRicci/qBittorrent_search_engines/master/yts_am.py",
+"https://raw.githubusercontent.com/MadeOfMagicAndWires/qBit-plugins/master/engines/linuxtracker.py",
+"https://raw.githubusercontent.com/MadeOfMagicAndWires/qBit-plugins/master/engines/nyaasi.py",
+"https://raw.githubusercontent.com/LightDestory/qBittorrent-Search-Plugins/master/src/engines/ettv.py",
+"https://raw.githubusercontent.com/LightDestory/qBittorrent-Search-Plugins/master/src/engines/glotorrents.py",
+"https://raw.githubusercontent.com/LightDestory/qBittorrent-Search-Plugins/master/src/engines/thepiratebay.py",
+"https://raw.githubusercontent.com/v1k45/1337x-qBittorrent-search-plugin/master/leetx.py",
+"https://raw.githubusercontent.com/nindogo/qbtSearchScripts/master/magnetdl.py",
+"https://raw.githubusercontent.com/msagca/qbittorrent_plugins/main/uniondht.py",
+"https://raw.githubusercontent.com/khensolomon/leyts/master/yts.py",
+]
```diff
@@ -22,7 +22,6 @@ pillow
 psutil
 pymongo
 pyrofork
-python-dotenv
 python-magic
 qbittorrent-api
 requests
```
update.py

```diff
@@ -1,5 +1,5 @@
 from sys import exit
-from dotenv import load_dotenv, dotenv_values
+from importlib import import_module
 from logging import (
     FileHandler,
     StreamHandler,
@@ -10,7 +10,7 @@ from logging import (
     getLogger,
     ERROR,
 )
-from os import path, environ, remove
+from os import path, remove
 from pymongo.mongo_client import MongoClient
 from pymongo.server_api import ServerApi
 from subprocess import run as srun
@@ -30,27 +30,21 @@ basicConfig(
     level=INFO,
 )
 
-load_dotenv("config.env", override=True)
-
-try:
-    if bool(environ.get("_____REMOVE_THIS_LINE_____")):
-        log_error("The README.md file there to be read! Exiting now!")
-        exit(1)
-except:
-    pass
+settings = import_module("config")
+config_file = {
+    key: value.strip() if isinstance(value, str) else value
+    for key, value in vars(settings).items()
+    if not key.startswith("__")
+}
 
-BOT_TOKEN = environ.get("BOT_TOKEN", "")
-if len(BOT_TOKEN) == 0:
+BOT_TOKEN = config_file.get("BOT_TOKEN", "")
+if not BOT_TOKEN:
     log_error("BOT_TOKEN variable is missing! Exiting now")
     exit(1)
 
 BOT_ID = BOT_TOKEN.split(":", 1)[0]
 
-DATABASE_URL = environ.get("DATABASE_URL", "")
-if len(DATABASE_URL) == 0:
-    DATABASE_URL = None
-
-if DATABASE_URL is not None:
+if DATABASE_URL := config_file.get("DATABASE_URL", "").strip():
     try:
         conn = MongoClient(DATABASE_URL, server_api=ServerApi("1"))
         db = conn.mltb
@@ -59,25 +53,19 @@ if DATABASE_URL is not None:
         if old_config is not None:
             del old_config["_id"]
         if (
-            old_config is not None
-            and old_config == dict(dotenv_values("config.env"))
-            or old_config is None
+            old_config is not None and old_config == config_file or old_config is None
         ) and config_dict is not None:
-            environ["UPSTREAM_REPO"] = config_dict["UPSTREAM_REPO"]
-            environ["UPSTREAM_BRANCH"] = config_dict["UPSTREAM_BRANCH"]
+            config_file["UPSTREAM_REPO"] = config_dict["UPSTREAM_REPO"]
+            config_file["UPSTREAM_BRANCH"] = config_dict["UPSTREAM_BRANCH"]
         conn.close()
     except Exception as e:
         log_error(f"Database ERROR: {e}")
 
-UPSTREAM_REPO = environ.get("UPSTREAM_REPO", "")
-if len(UPSTREAM_REPO) == 0:
-    UPSTREAM_REPO = None
+UPSTREAM_REPO = config_file.get("UPSTREAM_REPO", "").strip()
 
-UPSTREAM_BRANCH = environ.get("UPSTREAM_BRANCH", "")
-if len(UPSTREAM_BRANCH) == 0:
-    UPSTREAM_BRANCH = "master"
+UPSTREAM_BRANCH = config_file.get("UPSTREAM_BRANCH", "").strip() or "master"
 
-if UPSTREAM_REPO is not None:
+if UPSTREAM_REPO:
     if path.exists(".git"):
         srun(["rm", "-rf", ".git"])
```
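The update.py change above swaps dotenv parsing for a plain module import, collecting every public attribute of `config.py` into a dict. As a standalone sketch of that pattern (the `load_config` helper name is mine, not part of the repo):

```python
from importlib import import_module


def load_config(module_name: str = "config") -> dict:
    """Import a Python config module and return its public settings,
    stripping whitespace from string values (mirrors update.py)."""
    settings = import_module(module_name)
    return {
        key: value.strip() if isinstance(value, str) else value
        for key, value in vars(settings).items()
        if not key.startswith("__")  # drop __name__, __file__, etc.
    }
```

Because the config is now ordinary Python, values keep their real types (`int`, `bool`, `list`) instead of everything arriving as strings, which is why the README no longer asks for quotes around every value.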
web/nodes.py

```diff
@@ -1,12 +1,4 @@
 from anytree import NodeMixin
-from os import environ
-from re import findall as re_findall
-
-DOWNLOAD_DIR = environ.get("DOWNLOAD_DIR", "")
-if len(DOWNLOAD_DIR) == 0:
-    DOWNLOAD_DIR = "/usr/src/app/downloads/"
-elif not DOWNLOAD_DIR.endswith("/"):
-    DOWNLOAD_DIR += "/"
 
 
 class TorNode(NodeMixin):
@@ -42,12 +34,12 @@ def qb_get_folders(path):
     return path.split("/")
 
 
-def get_folders(path):
-    fs = re_findall(f"{DOWNLOAD_DIR}[0-9]+/(.+)", path)[0]
+def get_folders(path, root_path):
+    fs = path.split(root_path)[-1]
     return fs.split("/")
 
 
-def make_tree(res, tool=False):
+def make_tree(res, tool=False, root_path=""):
     if tool == "qbittorrent":
         parent = TorNode("QBITTORRENT")
         folder_id = 0
@@ -93,7 +85,7 @@ def make_tree(res, tool=False):
         parent = TorNode("ARIA2")
         folder_id = 0
         for i in res:
-            folders = get_folders(i["path"])
+            folders = get_folders(i["path"], root_path)
             priority = 1
             if i["selected"] == "false":
                 priority = 0
@@ -115,7 +107,9 @@
             else:
                 previous_node = current_node
             try:
-                progress = round((int(i["completedLength"]) / int(i["length"])) * 100, 5)
+                progress = round(
+                    (int(i["completedLength"]) / int(i["length"])) * 100, 5
+                )
             except:
                 progress = 0
             TorNode(
```
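The rewritten `get_folders` no longer regex-matches against a module-level `DOWNLOAD_DIR`; the caller supplies the root path and the relative part is obtained by a plain split. A self-contained sketch of the new logic:

```python
def get_folders(path: str, root_path: str) -> list[str]:
    # Keep only the part of the path after the download root,
    # then break it into folder/file components
    relative = path.split(root_path)[-1]
    return relative.split("/")
```

This also removes the implicit assumption that downloads always live under `DOWNLOAD_DIR/<numeric id>/`, which the old regex `[0-9]+/(.+)` hard-coded.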
```diff
@@ -354,8 +354,8 @@
 <header>
     <div class="container mx-auto px-4">
         <div class="flex flex-col sm:flex-row justify-between items-center">
-            <h1 class="text-3xl font-bold text-white mb-4 sm:mb-0">Torrent
-                file selector</h1>
+            <h1 class="text-3xl font-bold text-white mb-4 sm:mb-0">🗂 Torrent
+                File Selector</h1>
             <div class="flex items-center space-x-4">
                 <div class="flex items-center">
                     <span id="themeIcon" class="mr-2 text-white">🌙</span>
@@ -405,7 +405,7 @@
 <footer>
     <div class="container mx-auto px-4">
         <div class="footer-content flex flex-col sm:flex-row justify-between items-center">
-            <p class="text-white mb-4 sm:mb-0">© 2024 Torrent file selector.
+            <p class="text-white mb-4 sm:mb-0">© 2024 Torrent File Selector.
                 All rights reserved.</p>
             <div class="footer-buttons flex space-x-4">
                 <a href="https://github.com/anasty17/mirror-leech-telegram-bot" target="_blank"
@@ -811,7 +811,7 @@
             renderFileTree(currentFolder ? currentFolder.children : files);
         } else {
             modalTitle.textContent = 'Error';
-            modalBody.innerHTML = '<p>There was an error submitting your Rename. Please try again.</p>';
+            modalBody.innerHTML = '<p>There was an error submitting your Rename. Try Again!</p>';
         }
         modalFooter.innerHTML = '<button class="btn btn-primary" onclick="closeModal()">Okay</button>';
         openModal();
@@ -847,7 +847,7 @@
             return;
         }
         modalTitle.textContent = 'Processing...';
-        modalBody.innerHTML = `<p>Submitting your selection ${selectedCount.innerText} ... </p>`;
+        modalBody.innerHTML = `<p>Submitting ${selectedCount.innerText} file(s)... </p>`;
         modalFooter.innerHTML = '';
         openModal();
         const requestUrl = `/app/files/torrent?gid=${urlParams.gid}&pin=${pinInput.value}&mode=selection`;
@@ -860,7 +860,7 @@
             modalBody.innerHTML = '<p>Your selection has been submitted successfully.</p>';
         } else {
             modalTitle.textContent = 'Error';
-            modalBody.innerHTML = '<p>There was an error submitting your selection. Please try again.</p>';
+            modalBody.innerHTML = '<p>An error occurred while submitting your selection. Try Again!</p>';
         }
         modalFooter.innerHTML = '<button class="btn btn-primary" onclick="closeModal()">Okay</button>';
         openModal();
@@ -895,7 +895,7 @@
         return response.json().then(data => {
             if (data.error) {
                 modalTitle.textContent = data.error;
-                modalBody.innerHTML = `<p>${data.message}. Please try again.</p>`;
+                modalBody.innerHTML = `<p>${data.message}. Try Again!</p>`;
                 modalFooter.innerHTML = '<button class="btn btn-primary" onclick="closeModal()">Retry</button>';
                 openModal();
             } else {
@@ -913,14 +913,14 @@
             }
         });
     } else {
-        modalTitle.textContent = 'Something Went Wrong';
-        modalBody.innerHTML = '<p>Please check console. Status Code: ' + response.status + '</p>';
+        modalTitle.textContent = 'Something Went Wrong!';
+        modalBody.innerHTML = '<p>Check console. Status Code: ' + response.status + '</p>';
         modalFooter.innerHTML = '<button class="btn btn-primary" onclick="closeModal()">Retry</button>';
         openModal();
     }
 }).catch(error => {
     modalTitle.textContent = 'Server Error';
-    modalBody.innerHTML = '<p>There was an error connecting to the server. Please try again.</p><br>' + error.message;
+    modalBody.innerHTML = '<p>There was an error connecting to the server. Try Again!</p><br>' + error.message;
     modalFooter.innerHTML = '<button class="btn btn-primary" onclick="closeModal()">Retry</button>';
     openModal();
 });
```
|
@ -165,7 +165,8 @@ def handle_torrent():
|
||||
content = make_tree(res, "qbittorrent")
|
||||
else:
|
||||
res = aria2.client.get_files(gid)
|
||||
content = make_tree(res, "aria2")
|
||||
fpath = f"{aria2.client.get_option(gid)['dir']}/"
|
||||
content = make_tree(res, "aria2", fpath)
|
||||
except Exception as e:
|
||||
LOGGER.error(str(e))
|
||||
content = {
|
||||
|
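`handle_torrent` now fetches aria2's `dir` option and passes it as the tree root. Note that the f-string appends `/` unconditionally, so a `dir` value that already ends in a slash would yield a double slash. A small normalization helper would avoid that edge case (my suggestion, not in the commit):

```python
def aria2_root_path(dir_option: str) -> str:
    # Ensure exactly one trailing slash on aria2's "dir" option,
    # whether or not it already ends with "/"
    return dir_option.rstrip("/") + "/"
```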