Check 30,000 Domains for Availability in 1 Minute

Elite Member

Posts: 27
Threads: 14
Joined: Apr 2014
Reputation: 25

Credit: 42.6 RLP

There are several online checkers that let you check over 5,000 domains at once (covering most major TLDs). One of them is:
Hidden Content: you must reply to see the link.
I used to simply open 10 tabs and paste 50k domains into them, but I soon noticed that some tabs would stop processing domains, probably because of a per-IP rate limit.
Then I tried opening 5 browsers, each with a different IP, and had no problem checking 100k domains in 5 minutes.

How do you open 5 browsers, each with a different IP?

Quote:
1. Install Sandboxie:
Hidden Content: you must reply to see the link.
It lets you run the same application multiple times at once.
2. You need 2-3 browsers. I use Chrome, Firefox, and Internet Explorer.
3. Install VPN addons in Firefox and Chrome. I use these:
Hidden Content: you must reply to see the link.
Hidden Content: you must reply to see the link.
For Internet Explorer I use the HideMyAss VPN client (any VPN service will do).
4. Launch two instances of each browser (one normally and one inside Sandboxie).
5. Activate the VPN addons.

Now open the NameBright bulk checker in every browser:
Hidden Content: you must reply to see the link.
The biggest problem I had at the beginning was copying 5k domains to my clipboard and pasting them into each browser for checking. This took a lot of time.
Now I use a simple Python script that copies 5,000 fresh domains to my clipboard every time I press the "ESC" key.

from msvcrt import getch   # Windows-only single-key reader
import pyperclip           # clipboard access: pip install pyperclip

with open("path_of_domain_file.", "r") as keywordfile:
    keywords = [line.strip() for line in keywordfile]

# Split the list into chunks of 5,000 domains each.
keywords_parts = [keywords[x:x + 5000] for x in range(0, len(keywords), 5000)]

y = 0
while y < len(keywords_parts):
    key = ord(getch())
    if key == 27:  # ESC
        pyperclip.copy('\n'.join(keywords_parts[y]))
        print("Copied chunk %d of %d" % (y + 1, len(keywords_parts)))
        y += 1

I know the code is ugly :p
So now I just go through all my domains quickly, check them for availability, and export the results as CSV.

I put all the CSV files inside one folder and run "copy *.csv combined.csv" in cmd. This copies the contents of every CSV file into one file named "combined.csv".
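One caveat: if each exported CSV starts with a header row, the plain `copy` command will repeat that header in the middle of the combined file. A small Python sketch (folder and file names here are just placeholders) that merges every CSV in a folder while keeping only the first header:

```python
import csv
import glob
import os

def merge_csvs(folder, out_name="combined.csv"):
    """Merge every CSV in `folder` into one file, keeping only the first header row."""
    out_path = os.path.join(folder, out_name)
    with open(out_path, "w", newline="") as out:
        writer = csv.writer(out)
        header_written = False
        for path in sorted(glob.glob(os.path.join(folder, "*.csv"))):
            # Skip the output file itself if it is in the same folder.
            if os.path.abspath(path) == os.path.abspath(out_path):
                continue
            with open(path, newline="") as f:
                rows = list(csv.reader(f))
            if not rows:
                continue
            if not header_written:
                writer.writerow(rows[0])   # keep the header from the first file only
                header_written = True
            writer.writerows(rows[1:])
    return out_path
```

If your exports have no header rows, the plain `copy` command from cmd works fine as-is.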


Now I have a huge list of available domains. To filter it down I use ScrapeBox with the Page Authority addon, which lets you add an unlimited number of Moz API keys; I have about 500 configured. ScrapeBox caps the maximum connections at 100 by default, but you can bypass that by opening "\Addons\ScrapeBox Page Authority\sbpageauthority.addon.ini" and editing the connections value.
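If you end up editing that .ini often, Python's configparser can do it for you. This is only a rough sketch: the section and key names ("Settings", "Connections") are assumptions on my part, not taken from the real addon file, so open sbpageauthority.addon.ini first and use whatever names it actually contains.

```python
from configparser import ConfigParser

def raise_connections(ini_path, connections):
    """Bump the connection limit in an .ini file.

    Hypothetical sketch: "Settings" and "Connections" are assumed
    names -- check the real sbpageauthority.addon.ini for the
    actual section and key before relying on this.
    """
    cfg = ConfigParser()
    cfg.read(ini_path)
    if "Settings" not in cfg:
        cfg["Settings"] = {}
    cfg["Settings"]["Connections"] = str(connections)
    with open(ini_path, "w") as f:
        cfg.write(f)
```

Back up the original file before writing, since configparser rewrites the whole file and drops any comments.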

[Image: H4VXckI.gif]
05-17-2017, 01:29 AM