Ubuntu - Not Discovering Dumps #11

Open
dray0n opened this issue Mar 16, 2015 · 1 comment

dray0n commented Mar 16, 2015

Love the software. I've been following it on Twitter for some time.

I'm having trouble getting my own instance up and running. The dependencies are installed, and the script starts and runs, but it isn't discovering any dumps.

I'm watching @dumpmon, and I can see my instance "checking" those same links, but with no hits. I verified that they were over the email threshold from the config.


AsifAFK commented Mar 25, 2015

I think I had the same problem. Most of the leaks come from Pastebin, and I found that it was returning a "Please refresh the page to continue..." message instead of the raw content.

I'm guessing the problem was using the same requests.Session() for all the sites, since the Slexy headers (which include a Referer field) were also being sent to the other sites.

So, an easy fix is to create a new connection for every request.
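That quick fix could look something like the sketch below. The function name `download` and the `timeout` value are placeholders for illustration, not dumpmon's actual code:

```python
import requests

def download(url):
    # A fresh one-off request each time: no Session object is reused,
    # so no stale headers (like Slexy's Referer) carry over between sites.
    response = requests.get(url, timeout=10)
    return response.text
```

The downside is that you lose connection pooling, which is why a per-site session (below) is the nicer option.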

But I'd recommend creating a requests.Session() instance when each site is initialized, so each one has its own, and passing those instances to the helper.download() function instead of using a global session in helper.
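A minimal sketch of that per-site-session idea follows. The `PasteSite` class, `extra_headers` parameter, and `download` function are hypothetical stand-ins for dumpmon's own site classes and helper.download(), whose real signatures may differ:

```python
import requests

class PasteSite:
    def __init__(self, name, extra_headers=None):
        self.name = name
        # Each site owns its Session, so headers set for one site
        # (e.g. Slexy's Referer) never leak into another site's requests.
        self.session = requests.Session()
        if extra_headers:
            self.session.headers.update(extra_headers)

def download(site, url):
    # Use the site's own session instead of a module-global one.
    return site.session.get(url, timeout=10).text

slexy = PasteSite("slexy", {"Referer": "https://slexy.org/"})
pastebin = PasteSite("pastebin")

# Slexy's Referer stays confined to its own session:
assert "Referer" in slexy.session.headers
assert "Referer" not in pastebin.session.headers
```

This keeps connection pooling per site while isolating each site's headers and cookies from the others.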

If I'm wrong, please correct me, Jordan.
