1. Finding dead links (it checks whether each page still exists; if a domain has expired, the link is marked as dead).
2. Removing duplicate domains from a list.
3. Trimming URLs to the last folder.
4. Checking for certain text on a page (if you have a bulk list of websites that all run on the same platform, enter the text you want to find and it will check each page for it).
5. Checking for a certain URL on a page (used to verify backlinks; for example, after blasting a list, load your output here together with the links you used in the blast to see how many sites link back to you).
6. Removing domains that contain certain text (for example, if you have scraped a large list and want to remove Web 2.0 sites from it, load a text file of Web 2.0 domains and hit Process; every domain containing that text will be removed).
7. Returning the anchor text and URLs as well (for point 5).
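The list-processing features above (points 2, 3, 5, 6, and 7) can be sketched in Python. The tool's actual implementation isn't public, so the function names and details below are my own illustration, using only the standard library:

```python
from urllib.parse import urlparse
from html.parser import HTMLParser

def dedupe_domains(urls):
    """Point 2: keep only the first URL seen for each domain."""
    seen, out = set(), []
    for url in urls:
        domain = urlparse(url).netloc.lower()
        if domain not in seen:
            seen.add(domain)
            out.append(url)
    return out

def trim_to_last_folder(url):
    """Point 3: drop the final path segment, keeping the URL up to its last folder."""
    parts = urlparse(url)
    folder = parts.path.rsplit("/", 1)[0] + "/"
    return f"{parts.scheme}://{parts.netloc}{folder}"

def remove_domains_containing(urls, terms):
    """Point 6: drop every URL whose domain contains any of the given terms."""
    terms = [t.lower() for t in terms]
    return [u for u in urls
            if not any(t in urlparse(u).netloc.lower() for t in terms)]

class AnchorCollector(HTMLParser):
    """Points 5 and 7: collect (href, anchor text) pairs from a page's <a> tags,
    so hrefs can be matched against your own links and reported with their anchor text."""
    def __init__(self):
        super().__init__()
        self.links = []      # finished (href, anchor text) pairs
        self._href = None    # href of the <a> tag currently open, if any
        self._text = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []
    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)
    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None
```

To check backlinks, feed a fetched page's HTML into `AnchorCollector` with `parser.feed(html)`, then compare the hrefs in `parser.links` against the list of links you used.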
1. Multi-threaded (you can use up to 500 threads at a time).
2. Public proxies can be used.
3. You can export alive links, dead links, or all links.
4. Highly responsive.
5. You can pause the process and resume it later.
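A minimal multi-threaded dead-link checker along these lines can be sketched with the standard library. The worker count, the optional proxy argument, and the pause mechanism (a `threading.Event`) are my own stand-ins for behavior the tool doesn't document:

```python
from concurrent.futures import ThreadPoolExecutor
from urllib.request import Request, urlopen, ProxyHandler, build_opener
from urllib.error import URLError

def is_alive(url, proxy=None, timeout=10):
    """A URL is 'alive' if the server answers with an HTTP status below 400.
    Expired domains fail to resolve and raise URLError, so they count as dead.
    `proxy` (optional) is an 'http://host:port' public proxy."""
    opener = (build_opener(ProxyHandler({"http": proxy, "https": proxy}))
              if proxy else build_opener())
    try:
        req = Request(url, headers={"User-Agent": "Mozilla/5.0"})
        with opener.open(req, timeout=timeout) as resp:
            return resp.status < 400
    except (URLError, ValueError, OSError):
        return False

def check_links(urls, workers=500, pause_event=None):
    """Check many URLs concurrently. `pause_event` is a threading.Event:
    clear() it to pause between checks, set() it to resume."""
    def task(url):
        if pause_event is not None:
            pause_event.wait()   # blocks while the event is cleared (paused)
        return url, is_alive(url)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(task, urls))
```

`check_links(urls)` returns a dict mapping each URL to True (alive) or False (dead), which maps directly onto the "export alive/dead/all links" options.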