
Import/Export backups of collections in the database #3252

Closed
wants to merge 3 commits into from
Conversation

@ghost ghost commented Mar 12, 2023

Closes #3248

I don't feel the need to support import/export via a git-based provider, so this PR only works with JSON files sent via Discord.

[p]export <collection name>

[p]import <attach the file>

Comment on lines +762 to +783
    async def export_backups(self, collection_name: str):
        coll = self.db[collection_name]
        documents = []
        async for document in coll.find():
            documents.append(document)
        with open(f"{collection_name}.json", "w") as f:
            json.dump(documents, f, cls=CustomJSONEncoder)
        with open(f"{collection_name}.json", "rb") as f:
            file = discord.File(f, f"{collection_name}.json")
        success_message = (
            f"Exported {len(documents)} documents from {collection_name} to JSON. "
            "Check your DMs for the file."
        )
        return success_message, file

    async def import_backups(self, collection_name: str, file: discord.File):
        contents = await self.bot.loop.run_in_executor(None, file.fp.read)
        documents = json.loads(contents.decode("utf-8"))
        coll = self.db[collection_name]
        await coll.delete_many({})
        result = await coll.insert_many(documents)
        success_message = (
            f"Imported {len(result.inserted_ids)} documents from {file.filename} into {collection_name}."
        )
        return success_message
Collaborator

Instead of JSON, MongoDB stores data using BSON. Perhaps we can do something similar to this gist?

This way, some other native MongoDB types can be saved, and you wouldn't need the custom JSON encoder.

"""
Export a backup of a collection in the form of a json file.

{prefix}export <collection_name>
Collaborator

The two usage examples, on L2196 and L2210, are unnecessary, since discord.py auto generates the usage spec. However, it might be useful to note in the import command's help doc that you should attach the backup file to the message.

@checks.has_permissions(PermissionLevel.ADMINISTRATOR)
async def export_backup(self, ctx, collection_name):
"""
Export a backup of a collection in the form of a json file.
Collaborator

As noted in my other comment, we should store the backup as BSON file instead of JSON. Please change the references to BSON files instead.

@@ -2186,6 +2187,56 @@ async def isenable(self, ctx):

return await ctx.send(embed=embed)

@commands.command(name="export")
@checks.has_permissions(PermissionLevel.ADMINISTRATOR)
async def export_backup(self, ctx, collection_name):
Collaborator

Instead of exporting a single collection (most users don't even know what collections are), could you change it so that it exports everything into one file and imports it as well?

Author

You mean export the config/logs and any plugin collections into one BSON file? The issue with that is the bot won't be able to send large files.
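One possible mitigation for the file-size concern (an assumption on my part, not something this PR implements) would be to gzip-compress the serialized backup in memory before sending it, since JSON/BSON dumps compress well:

```python
import gzip
import io
import json

def compress_backup(documents):
    # Serialize to JSON, then gzip the bytes entirely in memory.
    raw = json.dumps(documents).encode("utf-8")
    buf = io.BytesIO()
    with gzip.GzipFile(fileobj=buf, mode="wb") as gz:
        gz.write(raw)
    buf.seek(0)
    # buf could then be wrapped in discord.File(buf, "backup.json.gz")
    return buf

def decompress_backup(buf):
    # Reverse of compress_backup: gunzip and parse the JSON payload.
    with gzip.GzipFile(fileobj=buf, mode="rb") as gz:
        return json.loads(gz.read().decode("utf-8"))
```

Whether this keeps a full-database export under Discord's upload limit depends on the database size, so it is a partial fix at best.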

        documents = []
        async for document in coll.find():
            documents.append(document)
        with open(f"{collection_name}.json", "w") as f:
Collaborator

Don't write to disk, that may lead to potential issues. Use a BytesIO instead.
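A sketch of the in-memory approach the reviewer suggests (JSON kept for illustration; the helper name is hypothetical):

```python
import io
import json

def export_to_buffer(documents):
    # Serialize into an in-memory buffer instead of a file on disk.
    buf = io.BytesIO(json.dumps(documents).encode("utf-8"))
    buf.seek(0)
    return buf

# discord.File accepts any file-like object, so the buffer can be sent directly:
# file = discord.File(export_to_buffer(documents), f"{collection_name}.json")
```

This also sidesteps the bug in the original code, where the `discord.File` is created inside a `with open(...)` block and the underlying file handle is closed before the file is sent.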

Author

ghost commented Apr 22, 2024

Unable to resolve the memory spike issues. If someone else is able to figure out a solution, please open a new PR.

@ghost ghost closed this Apr 22, 2024
This pull request was closed.