Executing the same queries over and over wastes time; APIGen addresses this problem. The project analyses a database's existing query logs and generates custom APIs for the important queries that are run most often. Once deployed, these APIs execute the recommended queries against the database on demand.
Install the dependencies listed in `requirements.txt`:
- pip install virtualenv
- virtualenv env
- source env/bin/activate (on Windows: env\Scripts\activate)
- pip install -r requirements.txt
- python manage.py runserver 8000
- Support for SQL and NoSQL databases: APIGen reads the provided query logs from both SQL and NoSQL databases, preprocesses the logs, and then feeds the resulting queries into our ML model.
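As a minimal sketch of the preprocessing step described above, the snippet below normalizes raw log entries so that structurally identical queries can be grouped. The regex rules and the function name `normalize_query` are illustrative assumptions; the real preprocessing depends on the log format of the connected database.

```python
import re

def normalize_query(query: str) -> str:
    """Collapse literals so structurally identical queries compare equal.

    Hypothetical normalization rules; the project's real preprocessing
    is specific to each database's log format.
    """
    q = query.strip().lower()
    q = re.sub(r"'[^']*'", "?", q)   # replace string literals with a placeholder
    q = re.sub(r"\b\d+\b", "?", q)   # replace numeric literals
    q = re.sub(r"\s+", " ", q)       # collapse runs of whitespace
    return q

print(normalize_query("SELECT * FROM users  WHERE id = 42"))
# select * from users where id = ?
```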
- Recommend APIs: Our ML model analyses the queries to find those executed most often, along with the tables accessed most frequently. It then generates custom APIs for the queries recommended by the model; these APIs fire the queries against the connected database whenever needed.
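To make the recommendation step concrete, the sketch below uses a plain frequency count over normalized query templates. This is only a stand-in for the project's ML model, which may weigh more signals than raw counts; the function name and top-N cutoff are assumptions.

```python
from collections import Counter

def recommend_queries(normalized_queries, top_n=5):
    """Return the query templates executed most often.

    A simple frequency count standing in for the ML model's ranking.
    """
    counts = Counter(normalized_queries)
    return [query for query, _ in counts.most_common(top_n)]

log = [
    "select * from users where id = ?",
    "select * from orders where user_id = ?",
    "select * from users where id = ?",
]
print(recommend_queries(log, top_n=1))
# ['select * from users where id = ?']
```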
- Query Analytics: Alongside the generated APIs, APIGen provides query analytics such as the queries with the shortest and longest processing times and the mean query processing time. These help generate insights into the performance of the database and of the queries that have been executed:
- Queries taking maximum processing time and the time taken
- Queries taking minimum processing time and the time taken
- Mean time taken by the queries to process
- Number of Queries Submitted over Time
- Correlation between Execution Time and Client's IP Address
- Top 5 Queried Tables
- Load on Database by Client's IP Address
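The timing metrics in the list above can be sketched as follows. The record shape, `(query, seconds)` pairs, and the function name are assumptions; the actual analytics run over whatever schema the parsed query logs use.

```python
from statistics import mean

def query_analytics(records):
    """Compute the max, min, and mean processing times from (query, seconds) pairs.

    Illustrative only: the real log schema and metric set are project-specific.
    """
    slowest = max(records, key=lambda r: r[1])   # query taking maximum time
    fastest = min(records, key=lambda r: r[1])   # query taking minimum time
    return {
        "max": slowest,
        "min": fastest,
        "mean_seconds": mean(t for _, t in records),
    }

records = [("q1", 0.12), ("q2", 0.48), ("q3", 0.30)]
stats = query_analytics(records)
print(stats["max"][0], stats["mean_seconds"])  # slowest query and mean, approx 0.3
```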