
wget crashes when trying to download db #26

Open
mkandziora opened this issue Jan 26, 2019 · 9 comments

@mkandziora

Hi,

I am trying to download a pre-built database, but wget crashes regularly, and the same happens with curl. I am not sure whether something is going wrong on my side or yours; I have tried different internet connections, but the result is the same. Do you have any advice on how to download the pre-built databases?

wget output:
wget -c -v "http://141.211.236.35:10998/pln.05082018.db"
--2019-01-25 16:16:11-- http://141.211.236.35:10998/pln.05082018.db
Connecting to 141.211.236.35:10998... connected.
HTTP request sent, awaiting response... 206 Partial Content
Length: 22727467008 (21G), 10828490310 (10G) remaining [application/octet-stream]
Saving to: ‘pln.05082018.db’

pln.05082018.db 52%[+++++++++++++++++++++++++++ ] 11.09G 844KB/s in 16s

2019-01-25 16:16:27 (411 KB/s) - Connection closed at byte 11905569730. Retrying.

--2019-01-25 16:16:28-- (try: 2) http://141.211.236.35:10998/pln.05082018.db

Curl output:
curl -v -o plant.db http://141.211.236.35:10998/pln.05082018.db

* Trying 141.211.236.35...
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
* Connected to 141.211.236.35 (141.211.236.35) port 10998 (#0)
> GET /pln.05082018.db HTTP/1.1
> Host: 141.211.236.35:10998
> User-Agent: curl/7.47.0
> Accept: */*
>
< HTTP/1.1 200 OK
< Accept-Ranges: bytes
< Content-Length: 22727467008
< Content-Type: application/octet-stream
< Last-Modified: Tue, 08 May 2018 18:38:14 GMT
< Date: Sat, 26 Jan 2019 00:11:38 GMT
<
{ [1167 bytes data]
  0 21.1G    0 6319k    0     0   434k      0 14:11:07  0:00:14 14:10:53  689k
* transfer closed with 22719634945 bytes remaining to read
  0 21.1G    0 7648k    0     0   496k      0 12:25:21  0:00:15 12:25:06  887k
* Closing connection 0
curl: (18) transfer closed with 22719634945 bytes remaining to read
@Cactusolo

Hi @blubbundbla,

I have come across a similar issue to the one you describe, but I used the curl -C option to resume the previous file transfer at the given offset. That worked for me.

Because the file itself is large, you will have to resume with "-C" a couple of times until it's done.

Note: please check man curl in your environment to make sure your curl version supports this feature.
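
A minimal sketch of the resume invocation, assuming the URL and output filename from the commands above ("-C -" tells curl to infer the resume offset from the size of the existing partial file):

# resume the download; "-C -" makes curl continue from where the
# existing partial file ends instead of starting over
curl -C - -o pln.05082018.db http://141.211.236.35:10998/pln.05082018.db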

Good luck!

Miao

@mkandziora
Author

Hi,
thank you for the advice. I tried using the -c option, but the connection still drops after a few seconds; I would literally need to rerun the command about 200 times.
I will probably try phlawd_db_maker instead. I just wanted to let you know that downloading the pre-built db is not really working.
Martha

@Cactusolo

I’m sorry it’s not working on your side. Anyway, I’ll share what worked in my own experience to help you.

Maybe there is some restriction at your institution that discourages downloading huge files.

Just make sure it’s “-C” (capital C); it took me 4-5 attempts in my case. I think curl also has a “--retry” feature.

Alternatively, you could run a bash script that keeps retrying with a combination of the "-C" and “--retry” options until the download is complete :)
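
A minimal, untested sketch of such a loop, assuming the same URL and output name as in the earlier commands:

# hypothetical retry loop: re-run curl with "-C -" until it exits with 0;
# "--retry 5" additionally lets curl retry transient failures on its own
until curl -C - --retry 5 -o pln.05082018.db http://141.211.236.35:10998/pln.05082018.db; do
    echo "transfer interrupted, resuming..."
    sleep 2
done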

I have tried “phlawd_db_maker”, see here.

@blackrim
Member

blackrim commented Feb 2, 2019

Hi folks,
Are you unable to resume the download at all?

@Cactusolo

@blackrim It worked on my side previously. I have not tried it recently.

Miao

@mkandziora
Author

@blackrim I tried it on different machines (all Linux) and with different internet connections, as I thought it might be a problem on my side. The connection drops approximately every 15 seconds. wget does continue the download with the -c option, but only 20 times; after that it stops because of too many retries. When I restart it, it continues again, but this makes the whole download unfeasible at 20 GB.
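
(For reference, 20 is wget's default retry limit; a sketch of lifting it, reusing the URL from the first comment. "-t 0", equivalently "--tries=inf", retries without limit:)

# wget defaults to --tries=20; "-t 0" removes the limit, and "-c"
# keeps resuming from the existing partial file on each retry
wget -c -t 0 "http://141.211.236.35:10998/pln.05082018.db"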

@blackrim
Member

blackrim commented Feb 3, 2019 via email

@mkandziora
Author

Yes, I was able to construct one using phlawd_db_maker.

@blackrim
Member

blackrim commented Feb 4, 2019 via email
