Some work on #3 and #4. Created a parser abstraction: each web_url is now a separate class extending UrlParser, which should simplify adding new proxy providers considerably. Also created a custom parser exception as a first step towards custom exception handling.
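The abstraction described in the commit message might look roughly like the sketch below. `UrlParser` is the name given in the commit; the subclass, method names, and the exception's exact name are illustrative assumptions, not the project's actual code.

```python
class ParserException(Exception):
    """Custom exception raised when a proxy page cannot be parsed."""


class UrlParser:
    """Base class: one subclass per proxy-list website."""

    def __init__(self, web_url, timeout=None):
        self.url = web_url
        self.timeout = timeout

    def parse_proxies(self, html):
        # Each provider-specific subclass implements its own parsing.
        raise NotImplementedError("subclasses must implement parse_proxies")


class FreeProxyListParser(UrlParser):
    """Hypothetical subclass for http://free-proxy-list.net (simplified)."""

    def parse_proxies(self, html):
        proxies = []
        for line in html.splitlines():
            line = line.strip()
            if not line:
                continue
            host, sep, port = line.partition(":")
            if not sep or not port.isdigit():
                raise ParserException("malformed proxy entry: %r" % line)
            proxies.append("%s:%s" % (host, port))
        return proxies
```

With this shape, adding a new proxy provider means adding one subclass, and callers can catch `ParserException` uniformly instead of provider-specific errors.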
README.md: 8 additions & 1 deletion
@@ -21,7 +21,7 @@ Surprisingly, the only thing that tells a server the application triggered the r
 
 ## The source code
 
-The project code in this repository is crawling three different public proxy websites:
+The project code in this repository is crawling **four** different public proxy websites:
 * http://proxyfor.eu/geo.php
 * http://free-proxy-list.net
 * http://rebro.weebly.com/proxy-list.html
@@ -31,3 +31,10 @@ After collecting the proxy data and filtering the slowest ones it is randomly se
 The request timeout is configured at 30 seconds and if the proxy fails to return a response it is deleted from the application proxy list.
 I have to mention that for each request a different agent header is used. The different headers are stored in the **/data/user_agents.txt** file which contains around 900 different agents.
 
+## Contributing
+
+Contributions are always welcome! Feel free to send a pull request.
+
+## Faced an issue?
+
+Open an issue [here](https://github.com/pgaref/HTTP_Request_Randomizer/issues), and be as detailed as possible :)
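The behaviour the README describes (a randomly chosen user-agent header per request, and dropping any proxy that fails to respond) can be sketched as follows. This is a minimal illustration under assumed function names, not the project's actual implementation.

```python
import random


def load_user_agents(path="data/user_agents.txt"):
    """Load one user-agent string per line (the README mentions ~900 entries)."""
    with open(path) as fh:
        return [line.strip() for line in fh if line.strip()]


def pick_agent(agents, rng=random):
    """Build headers with a different, randomly chosen agent for each request."""
    return {"User-Agent": rng.choice(agents)}


def drop_failed_proxy(proxies, proxy):
    """A proxy that times out (30 s per the README) is removed from the list."""
    if proxy in proxies:
        proxies.remove(proxy)
    return proxies
```

A caller would pass `pick_agent(...)` as the request headers and invoke `drop_failed_proxy` whenever a request through a proxy raises a timeout.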