Love the software. Been following it on Twitter for some time.
I'm having trouble getting my own instance up and running. Dependencies are installed, and the script starts and runs, but it isn't discovering any dumps.
I'm watching @dumpmon, and I can see my instance "Checking" those same links, but with no hits. I verified that the dumps were over the email threshold from the config.
I think I had the same problem. Most of the leaks come from Pastebin, and I found that it was returning a "Please refresh the page to continue..." message instead of the raw content.
I'm guessing the problem was using the same requests.Session() for all the sites, since the Slexy headers (with a Referer field) were being sent to the other sites too.
An easy fix is to create a new connection for every request.
But I'd recommend creating a requests.Session() instance when each site is initialized, so each one has its own, and passing that instance to helper.download() instead of using a global session in helper.
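A minimal sketch of what I mean (the `PasteSite` class and the `download()` signature here are illustrative, not dumpmon's actual code): each site owns its own `requests.Session()`, so headers set for one site can't leak into requests made for another.

```python
import requests

class PasteSite:
    """Sketch of a paste site that owns its own session, so
    site-specific headers (e.g. Slexy's Referer) stay isolated."""
    def __init__(self, name, headers=None):
        self.name = name
        self.session = requests.Session()  # per-site session, not a global
        if headers:
            self.session.headers.update(headers)

def download(url, session=None):
    """Hypothetical stand-in for helper.download(): it uses the
    session the caller passes in instead of a module-global one."""
    session = session or requests.Session()
    return session.get(url).text

# Slexy gets its Referer header; Pastebin's session never sees it.
slexy = PasteSite("slexy", headers={"Referer": "http://slexy.org/"})
pastebin = PasteSite("pastebin")
```

With this layout, `download(raw_url, session=pastebin.session)` would send only Pastebin's own headers, which should avoid the "Please refresh the page" response caused by the leaked Referer.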