
Commit

docs
vdusek committed Sep 24, 2024
1 parent 49243db commit 2c544b3
Showing 3 changed files with 3 additions and 3 deletions.
docs/examples/code/fill_and_submit_web_form_crawler.py: 1 addition & 1 deletion
@@ -18,7 +18,7 @@ async def request_handler(context: HttpCrawlingContext) -> None:
 request = Request.from_url(
     url='https://httpbin.org/post',
     method='POST',
-    data={
+    payload={
         'custname': 'John Doe',
         'custtel': '1234567890',
         'custemail': '[email protected]',
docs/examples/code/fill_and_submit_web_form_request.py: 1 addition & 1 deletion
@@ -4,7 +4,7 @@
 request = Request.from_url(
     url='https://httpbin.org/post',
     method='POST',
-    data={
+    payload={
         'custname': 'John Doe',
         'custtel': '1234567890',
         'custemail': '[email protected]',
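For reference, the standalone example in this file now builds the request roughly as follows. This is a minimal sketch, not the full file: it assumes `payload` accepts the mapping of form fields shown in the diff, the email value is a placeholder (the real one is obscured in the rendered diff), and the remaining form fields are cut off by the diff view.

```python
from crawlee import Request

# POST request whose body carries the form fields; `payload` is the renamed
# argument (formerly `data`).
request = Request.from_url(
    url='https://httpbin.org/post',
    method='POST',
    payload={
        'custname': 'John Doe',
        'custtel': '1234567890',
        'custemail': 'johndoe@example.com',  # placeholder; the real value is obscured above
        # further form fields are truncated in the diff view
    },
)
```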
docs/examples/fill_and_submit_web_form.mdx: 1 addition & 1 deletion
@@ -46,7 +46,7 @@ Now, let's create a POST request with the form fields and their values using the
   {RequestExample}
 </CodeBlock>
 
-Alternatively, you can send form data as URL parameters using the `query_params` argument. It depends on the form and how it is implemented. However, sending the data as a POST request body using the `data` parameter is generally a better approach.
+Alternatively, you can send form data as URL parameters using the `query_params` argument. It depends on the form and how it is implemented. However, sending the data as a POST request body using the `payload` is generally a better approach.
 
 ## Implementing the crawler
 
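The changed paragraph also mentions sending the same data as URL parameters via `query_params`. A minimal sketch of that variant, assuming the target form reads its fields from the query string (the endpoint and field subset here are illustrative, not part of the docs example):

```python
from crawlee import Request

# Same form fields, but sent as URL parameters instead of a POST body.
# This only works for forms that are submitted via the query string.
request = Request.from_url(
    url='https://httpbin.org/get',  # illustrative endpoint
    query_params={
        'custname': 'John Doe',
        'custtel': '1234567890',
    },
)
```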
