Code QA Improvement

 Backup function improvement
🔤 Language improvement
Sam 2022-02-04 20:18:36 +08:00
parent 94be055d2f
commit c3a3794bd0
Signed by: sam01101
GPG Key ID: 42D7B6D13FF5E611
5 changed files with 94 additions and 70 deletions

View File

@ -128,17 +128,18 @@ parseqr_e_noqr: Target is not a QR Code.
parseqr_log: Parsed QR Code with content
# backup
# backup
backup_des: Backup data files to local as well as log channels, support full backup of database, safe and reliable.
backup_process: It may take some time to complete the backup.
backup_success_channel: Backup is complete! And it has been packaged and sent to the Log channel!
backup_success: Backup is complete!
backup_des: Back up data files locally and to the log channel. Supports database backup; safe and reliable.
backup_process: Processing backup... It may take some time to complete.
backup_success_channel: Backup complete! The zipped backup has been sent to the log channel.
backup_success: Backup complete!
# recovery
recovery_des: Restore data from local backup or replied backup file, and support full database recovery easily and quickly.
recovery_file_error: Unknown file that may not be the type of file backed up by pagermaid.
recovery_down: Downloading backup file to local...
recovery_process: It may take some time to complete the recovery.
recovery_file_error: Unknown file. It may not be a backup created by PagerMaid.
recovery_down: 'Downloading backup file to local storage...
It is recommended to reduce the use of PagerMaid during this process.'
recovery_process: Restoring backup... It may take some time to complete.
recovery_file_not_found: No backup file found.
recovery_success: Data file recovery is complete! You may need to restart manually for this to take effect.
recovery_success: Backup restored!
# captions
# convert
convert_des: Reply to an attachment message and convert it to image output
@ -510,7 +511,7 @@ update_change_log: update log
update_log_too_big: The update log is too long and files are being attached.
update_hint: Please use the following command to update
update_found_pulling: An update was found and is being pulled...
update_success: Update successfully
update_success: Updated successfully.
update_failed: update failed
update_auto_upgrade_git_failed_ubuntu: It is detected that your system is Ubuntu or Debian. Tried to upgrade git automatically but failed. Please upgrade manually.
update_auto_upgrade_git_failed_cent: It is detected that your system is CentOS. Tried to upgrade git automatically but failed. Please upgrade manually.

View File

@ -136,16 +136,16 @@ parseqr_log: 已解析一张带有 QR 码的消息,内容:
# backup
## backup
backup_des: 备份数据文件到本地以及日志频道,支持数据库的全量备份,安全可靠。
backup_process: 开始备份,可能需要一定的时间。。。
backup_success_channel: 数据文件备份完成打包发送到 Log 频道!
backup_process: 备份中,请耐心等待备份程序完成...
backup_success_channel: 数据文件备份完成并已打包发送到 Log 频道!
backup_success: 数据文件备份完成!
## recovery
recovery_des: 从本地备份或者所回复的备份文件中恢复数据,支持数据库的全量恢复,方便快速。
recovery_file_error: 未知的文件,可能并不是通过 pagermaid 所备份的文件类型。
recovery_down: 正在下载备份文件到本地。。。
recovery_file_error: 未知的文件,可能并不是通过 PagerMaid 所备份的文件类型。
recovery_down: 正在下载备份文件到本地... 在备份的过程中,建议减少使用 PagerMaid
recovery_process: 开始恢复,可能需要一定的时间。。
recovery_file_not_found: 没有找到备份文件。
recovery_success: 数据文件恢复完成!您可能需要手动重新启动才能生效。
recovery_file_not_found: 没有可用的备份文件。
recovery_success: 备份恢复完成!
# captions
## convert
@ -527,7 +527,7 @@ update_change_log: 更新日志
update_log_too_big: 更新日志太长,正在附加文件。
update_hint: 请使用以下命令进行更新
update_found_pulling: 找到更新,正在拉取 . . .
update_success: 更新成功
update_success: 更新成功。
update_failed: 更新失败
update_auto_upgrade_git_failed_ubuntu: 检测到您的系统是 Ubuntu 或 Debian,尝试自动升级 git 但失败,请手动升级。
update_auto_upgrade_git_failed_cent: 检测到您的系统是 CentOS,尝试自动升级 git 但失败,请手动升级。

View File

@ -128,17 +128,17 @@ parseqr_e_noqr: 回覆的附件錯誤!
parseqr_log: 已解析QR 扣的內容,如下:
# backup
# backup
backup_des: 備份數據文件到本地以及日誌頻道,支持數據庫的全量備份,安全可靠。
backup_process: 開始備份,可能需要一定的時間。。。
backup_success_channel: 數據文件備份完成打包發送到 Log 頻道!
backup_des: 備份數據文件到本地以及日誌頻道,支援數據庫備份,安全可靠。
backup_process: 備份中,請耐心等待備份程序完成...
backup_success_channel: 數據文件備份完成並已打包發送到 Log 頻道!
backup_success: 數據文件備份完成!
# recovery
recovery_des: 從本地備份或者所回复的備份文件中恢復數據,支持數據庫的全量恢復,方便快速。
recovery_file_error: 未知的文件,可能並不是通過 pagermaid 所備份的文件類型。
recovery_down: 正在下載備份文件到本地。。。
recovery_file_error: 未知的文件,可能並不是通過 PagerMaid 所備份的文件類型。
recovery_down: 正在下載備份文件到本地... 備份的過程中,建議減少使用 PagerMaid
recovery_process: 開始恢復,可能需要一定的時間。。
recovery_file_not_found: 沒有找到備份文件。
recovery_success: 數據文件恢復完成!您可能需要手動重新啟動才能生效。
recovery_file_not_found: 找不到備份文件。
recovery_success: 備份恢復完成!
# captions
# convert
convert_des: 回覆附件訊息並轉換為圖片
@ -306,7 +306,7 @@ apt_processing: 安裝插件中…
apt_no_py: Error: 無法獲取插件文件。
apt_plugin: 插件
apt_installed: 已安裝
apt_reboot: 正在重新啟動
apt_reboot: 正在重新啟動 PagerMaid-Modify
apt_install_success: 安裝成功
apt_not_found: Error: 沒有找到插件
apt_install_failed: 安裝失敗
@ -475,7 +475,7 @@ eval_need_dev: '**請注意:此命令可以直接操作您的賬戶**
# restart
restart_des: 重新啟動
restart_processing: 正在嘗試重新啟動
restart_complete: 重新啟動完成
restart_complete: PagerMaid-Modify 重新啟動完成
restart_log: 重新啟動 PagerMaid-Modify
# trace
trace_des: 跟蹤URL的重新導向。
@ -510,7 +510,7 @@ update_change_log: 更新日誌
update_log_too_big: 更新日誌太長,正在產生文件
update_hint: 請使用以下命令更新
update_found_pulling: 找到更新,正在獲取
update_success: 更新成功
update_success: 更新成功。
update_failed: 更新失敗
update_auto_upgrade_git_failed_ubuntu: 檢測到您的系統為 Ubuntu/Debian,嘗試自動升級 Git 失敗,請手動升級。
update_auto_upgrade_git_failed_cent: 檢測到您的系統為 CentOS/RedHat Linux,嘗試自動升級 Git 失敗,請手動升級。

View File

@ -176,14 +176,8 @@ redis_host = config.get('redis').get('host', 'localhost')
redis_port = config.get('redis').get('port', 6379)
redis_db = config.get('redis').get('db', 14)
redis_password = config.get('redis').get('password', '')
if strtobool(config.get('ipv6', 'False')):
    use_ipv6 = True
else:
    use_ipv6 = False
if strtobool(config.get('silent', 'True')):
    silent = True
else:
    silent = False
use_ipv6 = bool(strtobool(config.get('ipv6', 'False')))
silent = bool(strtobool(config.get('silent', 'True')))
if api_key is None or api_hash is None:
    logs.info(
        lang('config_error')
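
A note on the simplification above: distutils.util.strtobool() returns 1 or 0 rather than a bool and raises ValueError on unrecognized input, which is why the new one-liners wrap it in bool(). Since distutils is deprecated and removed in Python 3.12, a small local helper could take its place; the sketch below is only illustrative (the _to_bool name is hypothetical, not part of this commit) and keeps the same semantics.

def _to_bool(value, default=False):
    # Accept the same truthy/falsy spellings as distutils.util.strtobool.
    if value is None:
        return default
    value = str(value).strip().lower()
    if value in ("y", "yes", "t", "true", "on", "1"):
        return True
    if value in ("n", "no", "f", "false", "off", "0"):
        return False
    raise ValueError(f"invalid truth value {value!r}")

# Equivalent to the new config lines:
# use_ipv6 = _to_bool(config.get('ipv6', 'False'))
# silent = _to_bool(config.get('silent', 'True'))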

View File

@ -4,11 +4,15 @@ import os
import tarfile
from distutils.util import strtobool
from io import BytesIO
from traceback import format_exc
from pagermaid import config, redis_status, redis, silent
from telethon.tl.types import MessageMediaDocument
from pagermaid import config, redis_status, redis
from pagermaid.listener import listener
from pagermaid.utils import alias_command, upload_attachment, lang
from telethon.tl.types import MessageMediaDocument
pgm_backup_zip_name = "pagermaid_backup.tar.gz"
def make_tar_gz(output_filename, source_dirs: list):
@ -35,22 +39,25 @@ def un_tar_gz(filename, dirs):
        t.extractall(path=dirs)
        return True
    except Exception as e:
        print(e)
        print(e, format_exc())
        return False
@listener(is_plugin=True, outgoing=True, command=alias_command("backup"),
@listener(is_plugin=True, outgoing=True, owners_only=True, command=alias_command("backup"),
          description=lang('back_des'))
async def backup(context):
    if not silent:
        await context.edit(lang('backup_process'))
    if os.path.exists("pagermaid_backup.tar.gz"):
        os.remove("pagermaid_backup.tar.gz")
    await context.edit(lang('backup_process'))
    # Remove old backup
    if os.path.exists(pgm_backup_zip_name):
        os.remove(pgm_backup_zip_name)
    # remove media files (mp3 / jpg / flac / ogg), they are so big!
    for i in os.listdir("data"):
        if i.find(".mp3") != -1 or i.find(".jpg") != -1 or i.find(".flac") != -1 or i.find(".ogg") != -1:
            os.remove(f"data{os.sep}{i}")
    # backup redis
    # backup redis when available
    redis_data = {}
    if redis_status():
        for k in redis.keys():
@ -58,47 +65,69 @@ async def backup(context):
            if data_type == b'string':
                v = redis.get(k)
                redis_data[k.decode()] = v.decode()
    with open(f"data{os.sep}redis.json", "w", encoding='utf-8') as f:
        json.dump(redis_data, f, indent=4)
        with open(f"data{os.sep}redis.json", "w", encoding='utf-8') as f:
            json.dump(redis_data, f, indent=4)
    # run backup function
    make_tar_gz("pagermaid_backup.tar.gz", ["data", "plugins", "config.yml"])
    make_tar_gz(pgm_backup_zip_name, ["data", "plugins", "config.yml"])
    if strtobool(config['log']):
        await upload_attachment("pagermaid_backup.tar.gz", int(config['log_chatid']), None)
        await upload_attachment(pgm_backup_zip_name, int(config['log_chatid']), None)
        await context.edit(lang("backup_success_channel"))
    else:
        await context.edit(lang("backup_success"))
@listener(is_plugin=True, outgoing=True, command=alias_command("recovery"),
@listener(is_plugin=True, outgoing=True, owners_only=True, command=alias_command("recovery"),
          description=lang('recovery_des'))
async def recovery(context):
    message = await context.get_reply_message()
    if message and message.media:
    if message and message.media:  # Overwrite local backup
        if isinstance(message.media, MessageMediaDocument):
            try:
                file_name = message.media.document.attributes[0].file_name
            except:
                return await context.edit(lang('recovery_file_error'))
            if file_name.find(".tar.gz") != -1:
                await context.edit(lang('recovery_down'))
            else:
                if message.media.document.attributes[0].file_name.find(".tar.gz") != -1:  # Verify filename
                    await context.edit(lang('recovery_down'))
                    # Start download process
                    _file = BytesIO()
                    await context.client.download_file(message.media.document, _file)
                    with open(pgm_backup_zip_name, "wb") as f:
                        f.write(_file.getvalue())
                else:
                    return await context.edit(lang('recovery_file_error'))
            except Exception as e:  # noqa
                print(e, format_exc())
                return await context.edit(lang('recovery_file_error'))
        else:
            return await context.edit(lang('recovery_file_error'))
        _file = BytesIO()
        await context.client.download_file(message.media.document, _file)
        with open("pagermaid_backup.tar.gz", "wb") as f:
            f.write(_file.getvalue())
    if not silent:
        await context.edit(lang('recovery_process'))
    if not os.path.exists("pagermaid_backup.tar.gz"):
    # Extract backup files
    await context.edit(lang('recovery_process'))
    if not os.path.exists(pgm_backup_zip_name):
        return await context.edit(lang('recovery_file_not_found'))
    un_tar_gz("pagermaid_backup.tar.gz", "")
    # recovery redis
    if redis_status():
        if os.path.exists(f"data{os.sep}redis.json"):
            with open(f"data{os.sep}redis.json", "r", encoding='utf-8') as f:
    elif not un_tar_gz(pgm_backup_zip_name, ""):
        os.remove(pgm_backup_zip_name)
        return await context.edit(lang('recovery_file_error'))
    # Recovery redis
    if redis_status() and os.path.exists(f"data{os.sep}redis.json"):
        with open(f"data{os.sep}redis.json", "r", encoding='utf-8') as f:
            try:
                redis_data = json.load(f)
                for k, v in redis_data.items():
                    redis.set(k, v)
    await context.edit(lang('recovery_success'))
                for k, v in redis_data.items():
                    redis.set(k, v)
            except json.JSONDecodeError:
                """JSON load failed, skip redis recovery"""
            except Exception as e:  # noqa
                print(e, format_exc())
    # Cleanup
    if os.path.exists(pgm_backup_zip_name):
        os.remove(pgm_backup_zip_name)
    if os.path.exists(f"data{os.sep}redis.json"):
        os.remove(f"data{os.sep}redis.json")
    result = await context.edit(lang('recovery_success') + " " + lang('apt_reboot'))
    if redis_status():
        redis.set("restart_edit", f"{result.id}|{result.chat_id}")
    await context.client.disconnect()
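
For context on the helpers referenced in this diff: make_tar_gz() packs the data, plugins and config.yml paths into a single archive, and un_tar_gz() extracts it and reports success. Their bodies are not shown in the hunks above, so the following is only a minimal sketch consistent with the signatures and the un_tar_gz tail visible in the diff, not necessarily the exact implementation in backup.py.

import os
import tarfile
from traceback import format_exc

def make_tar_gz(output_filename, source_dirs: list):
    # Pack each existing path in source_dirs into one .tar.gz archive.
    with tarfile.open(output_filename, "w:gz") as tar:
        for source_dir in source_dirs:
            if os.path.exists(source_dir):
                # arcname keeps entries relative so the archive extracts in place
                tar.add(source_dir, arcname=os.path.basename(source_dir))

def un_tar_gz(filename, dirs):
    # Extract a .tar.gz archive into dirs; return True on success, False on error.
    try:
        t = tarfile.open(filename, "r:gz")
        t.extractall(path=dirs)
        return True
    except Exception as e:
        print(e, format_exc())
        return False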