Reformat and use numpy docstyle for robots.py
eliasdabbas committed Apr 2, 2024
1 parent 05a9d77 commit f6b052c
Showing 1 changed file with 17 additions and 5 deletions.
22 changes: 17 additions & 5 deletions advertools/robotstxt.py
@@ -467,6 +467,23 @@ def robotstxt_test(robotstxt_url, user_agents, urls):
All the combinations of :attr:`user_agents` and :attr:`urls` will be
checked and the results returned in one DataFrame.

Parameters
----------
robotstxt_url : str
    The URL of the robots.txt file.
user_agents : str, list
    One or more user agents.
urls : str, list
    One or more paths (relative) or URLs (absolute) to check.

Returns
-------
robotstxt_test_df : pandas.DataFrame
    A DataFrame with the test results per user-agent/rule combination.

Examples
--------
>>> robotstxt_test(
... "https://facebook.com/robots.txt",
... user_agents=["*", "Googlebot", "Applebot"],
Expand All @@ -486,11 +503,6 @@ def robotstxt_test(robotstxt_url, user_agents, urls):
10 https://facebook.com/robots.txt Googlebot /groups True
11 https://facebook.com/robots.txt Googlebot /hashtag/ False
:param url robotstxt_url: The URL of robotx.txt file
:param str,list user_agents: One or more user agents
:param str,list urls: One or more paths (relative) or URLs (absolute) to
check
:return DataFrame robotstxt_test_df:
"""
if not robotstxt_url.endswith("/robots.txt"):
raise ValueError("Please make sure you enter a valid robots.txt URL")
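The check that `robotstxt_test` performs for every user-agent/URL combination can be sketched with Python's stdlib `urllib.robotparser` (which advertools may or may not use internally; this is an illustration, not the library's implementation). The rules string below is a hypothetical example, not Facebook's actual robots.txt:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules for illustration only.
rules = """
User-agent: *
Disallow: /groups

User-agent: Googlebot
Allow: /groups
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Test every user-agent against every path, as robotstxt_test does
# before assembling the results into a DataFrame.
for agent in ["*", "Googlebot"]:
    for path in ["/groups", "/photos"]:
        print(agent, path, parser.can_fetch(agent, path))
```

With these rules, `can_fetch("Googlebot", "/groups")` is `True` (the more specific `Allow` entry wins for that agent) while `can_fetch("*", "/groups")` is `False`.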
