
YFTzMissingError('$%ticker%: possibly delisted; No timezone found') when downloading price data #1996

Open
mattpabi opened this issue Jul 20, 2024 · 14 comments

Comments


mattpabi commented Jul 20, 2024

Describe bug

Hi, this is an issue I have not seen anybody raise: YFTzMissingError. I have been using yfinance for about 2 years now without ever encountering it before.

When using yf.download(ticker), a "no timezone found" error is raised, and the debug log shows that the request timed out. It's weird, as the code seems to randomly work (without any intervention) every few minutes.

  • It is also worth noting that, for some reason, the code's success rate is drastically higher when run in a Jupyter Notebook cell rather than in a .py script.

Thanks in advance.

Simple code that reproduces your problem

import yfinance as yf

yf.enable_debug_mode()  # show debug log
df = yf.download("AAPL")  # <-- suspect


### OUTPUT:

Failed to get ticker 'AAPL' reason: HTTPSConnectionPool(host='query1.finance.yahoo.com', port=443): Read timed out. (read timeout=30)
[*********************100%%**********************]  1 of 1 completed

1 Failed download:
['AAPL']: YFTzMissingError('$%ticker%: possibly delisted; No timezone found')

Debug log

DEBUG    Entering download()
DEBUG     Disabling multithreading because DEBUG logging enabled
DEBUG     Entering history()
DEBUG      Entering history()
DEBUG       AAPL: Yahoo GET parameters: {'period1': '1925-08-13 20:00:00-04:00', 'period2': '2024-07-20 07:28:56-04:00', 'interval': '1d', 'includePrePost': False, 'events': 'div,splits,capitalGains'}
DEBUG       Entering get()
DEBUG        url=https://query2.finance.yahoo.com/v8/finance/chart/AAPL
DEBUG        params={'period1': -1400630400, 'period2': 1721474936, 'interval': '1d', 'includePrePost': False, 'events': 'div,splits,capitalGains'}
DEBUG        Entering _get_cookie_and_crumb()
DEBUG         cookie_mode = 'basic'
DEBUG         Entering _get_cookie_and_crumb_basic()
DEBUG          reusing cookie
DEBUG          reusing crumb
DEBUG         Exiting _get_cookie_and_crumb_basic()
DEBUG        Exiting _get_cookie_and_crumb()
ERROR     
          1 Failed download:
ERROR     ['AAPL']: ReadTimeout(ReadTimeoutError("HTTPSConnectionPool(host='query2.finance.yahoo.com', port=443): Read timed out. (read timeout=10)"))
DEBUG     ['AAPL']: Traceback (most recent call last):
            File "/usr/local/lib/python3.12/site-packages/urllib3/connectionpool.py", line 466, in _make_request
              self._validate_conn(conn)
            File "/usr/local/lib/python3.12/site-packages/urllib3/connectionpool.py", line 1095, in _validate_conn
              conn.connect()
            File "/usr/local/lib/python3.12/site-packages/urllib3/connection.py", line 652, in connect
              sock_and_verified = _ssl_wrap_socket_and_match_hostname(
                                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
            File "/usr/local/lib/python3.12/site-packages/urllib3/connection.py", line 805, in _ssl_wrap_socket_and_match_hostname
              ssl_sock = ssl_wrap_socket(
                         ^^^^^^^^^^^^^^^^
            File "/usr/local/lib/python3.12/site-packages/urllib3/util/ssl_.py", line 465, in ssl_wrap_socket
              ssl_sock = _ssl_wrap_socket_impl(sock, context, tls_in_tls, server_hostname)
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
            File "/usr/local/lib/python3.12/site-packages/urllib3/util/ssl_.py", line 509, in _ssl_wrap_socket_impl
              return ssl_context.wrap_socket(sock, server_hostname=server_hostname)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
            File "/usr/local/lib/python3.12/ssl.py", line 455, in wrap_socket
              return self.sslsocket_class._create(
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
            File "/usr/local/lib/python3.12/ssl.py", line 1042, in _create
              self.do_handshake()
            File "/usr/local/lib/python3.12/ssl.py", line 1320, in do_handshake
              self._sslobj.do_handshake()
          TimeoutError: _ssl.c:983: The handshake operation timed out
          
          The above exception was the direct cause of the following exception:
          
          Traceback (most recent call last):
            File "/usr/local/lib/python3.12/site-packages/requests/adapters.py", line 667, in send
              resp = conn.urlopen(
                     ^^^^^^^^^^^^^
            File "/usr/local/lib/python3.12/site-packages/urllib3/connectionpool.py", line 843, in urlopen
              retries = retries.increment(
                        ^^^^^^^^^^^^^^^^^^
            File "/usr/local/lib/python3.12/site-packages/urllib3/util/retry.py", line 474, in increment
              raise reraise(type(error), error, _stacktrace)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
            File "/usr/local/lib/python3.12/site-packages/urllib3/util/util.py", line 39, in reraise
              raise value
            File "/usr/local/lib/python3.12/site-packages/urllib3/connectionpool.py", line 789, in urlopen
              response = self._make_request(
                         ^^^^^^^^^^^^^^^^^^^
            File "/usr/local/lib/python3.12/site-packages/urllib3/connectionpool.py", line 490, in _make_request
              raise new_e
            File "/usr/local/lib/python3.12/site-packages/urllib3/connectionpool.py", line 468, in _make_request
              self._raise_timeout(err=e, url=url, timeout_value=conn.timeout)
            File "/usr/local/lib/python3.12/site-packages/urllib3/connectionpool.py", line 369, in _raise_timeout
              raise ReadTimeoutError(
          urllib3.exceptions.ReadTimeoutError: HTTPSConnectionPool(host='query2.finance.yahoo.com', port=443): Read timed out. (read timeout=10)
          
          During handling of the above exception, another exception occurred:
          
          Traceback (most recent call last):
            File "/usr/local/lib/python3.12/site-packages/yfinance/multi.py", line 268, in _download_one
              data = Ticker(ticker).history(
                     ^^^^^^^^^^^^^^^^^^^^^^^
            File "/usr/local/lib/python3.12/site-packages/yfinance/utils.py", line 104, in wrapper
              result = func(*args, **kwargs)
                       ^^^^^^^^^^^^^^^^^^^^^
            File "/usr/local/lib/python3.12/site-packages/yfinance/base.py", line 78, in history
              return self._lazy_load_price_history().history(*args, **kwargs)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
            File "/usr/local/lib/python3.12/site-packages/yfinance/utils.py", line 104, in wrapper
              result = func(*args, **kwargs)
                       ^^^^^^^^^^^^^^^^^^^^^
            File "/usr/local/lib/python3.12/site-packages/yfinance/scrapers/history.py", line 140, in history
              # Date range in past so safe to fetch through cache:
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
            File "/usr/local/lib/python3.12/site-packages/yfinance/utils.py", line 104, in wrapper
              result = func(*args, **kwargs)
                       ^^^^^^^^^^^^^^^^^^^^^
            File "/usr/local/lib/python3.12/site-packages/yfinance/data.py", line 366, in get
              response = self._session.get(**request_args)
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
            File "/usr/local/lib/python3.12/site-packages/requests_cache/session.py", line 127, in get
              return self.request('GET', url, params=params, **kwargs)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
            File "/usr/local/lib/python3.12/site-packages/requests_cache/session.py", line 183, in request
              return super().request(method, url, *args, headers=headers, **kwargs)  # type: ignore
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
            File "/usr/local/lib/python3.12/site-packages/requests/sessions.py", line 589, in request
              resp = self.send(prep, **send_kwargs)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
            File "/usr/local/lib/python3.12/site-packages/requests_cache/session.py", line 230, in send
              response = self._send_and_cache(request, actions, cached_response, **kwargs)
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
            File "/usr/local/lib/python3.12/site-packages/requests_cache/session.py", line 254, in _send_and_cache
              response = super().send(request, **kwargs)
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
            File "/usr/local/lib/python3.12/site-packages/requests/sessions.py", line 703, in send
              r = adapter.send(request, **kwargs)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
            File "/usr/local/lib/python3.12/site-packages/requests/adapters.py", line 713, in send
              raise ReadTimeout(e, request=request)
          requests.exceptions.ReadTimeout: HTTPSConnectionPool(host='query2.finance.yahoo.com', port=443): Read timed out. (read timeout=10)
          
DEBUG    Exiting download()

Bad data proof

No response

yfinance version

0.2.41

Python version

3.12.4

Operating system

No response

@ValueRaider
Collaborator

Each command-line Python re-run will fetch a new crumb to initialise the new data singleton. But Jupyter persists Python state so no re-fetch happens? That could explain random difference.
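
If that is the cause, a disk-backed session should make a plain .py run behave like the long-lived Jupyter kernel, because the cookie and cached responses survive across interpreter restarts. A minimal sketch, assuming requests_cache is installed (it already appears in the traceback above) and that this yfinance version accepts a session argument; the cache filename and User-Agent are arbitrary:

import requests_cache
import yfinance as yf

# Disk-backed session: the Yahoo cookie and cached responses persist across
# script runs, so each fresh command-line invocation doesn't start from scratch.
session = requests_cache.CachedSession("yfinance.cache")
session.headers["User-Agent"] = "my-program/1.0"

df = yf.download("AAPL", session=session)
print(df.tail())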

@mattpabi
Author

> Each command-line Python re-run will fetch a new crumb to initialise the new data singleton. But Jupyter persists Python state so no re-fetch happens? That could explain random difference.

Thanks, I see. Is there any way to solve this issue, though? It's quite troublesome to scrape on demand with this error, especially since I can't find anyone who has had this YFTzMissingError with a fix.
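
As a stop-gap, and given that the request randomly succeeds every few minutes, a simple retry loop around the download call might paper over the intermittent timeouts while the root cause is investigated. A sketch only; the attempt count and delay are arbitrary:

import time
import yfinance as yf

def download_with_retry(ticker, attempts=5, delay=30):
    """Keep calling yf.download until it returns a non-empty DataFrame."""
    for i in range(attempts):
        df = yf.download(ticker)
        if not df.empty:
            return df
        print(f"attempt {i + 1} failed, retrying in {delay}s")
        time.sleep(delay)
    raise RuntimeError(f"{ticker}: no data after {attempts} attempts")

df = download_with_retry("AAPL")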


TheSardOz commented Jul 21, 2024

I have been having the same issue. The data returns randomly; I have also been using just the endpoint daily for the last 3 years, and the same thing happens there. This has been going on for about one month now.
And same as you, when I run my code in a Jupyter notebook the data is more consistent, but it still takes quite a bit longer than it used to.
If you play with Yahoo Finance itself (pulling up a ticker and changing timeframes), the data is inconsistent there too.

@quanxiang-liu

Same for me.

@fabiodm7

same issue

@martinbouhier

same issue

@ValueRaider
Collaborator

Spamming "same" is not helping; use the reaction feature instead.

https://github.com/ranaroussi/yfinance#developers-want-to-contribute

ValueRaider pinned this issue Jul 24, 2024
@mattpabi
Author

Can we assume this is an issue with Yahoo's API, and not the yfinance package?

@oxhaowang

Experienced the same problem with version 0.2.41 (Python 3.12.2), yet somehow the same code works perfectly well with version 0.2.28 in a different environment (Python 3.12.4)....

@ValueRaider
Collaborator

ValueRaider commented Aug 15, 2024

Only one request-related change since 0.2.28: adding the cookie & crumb in 0.2.32, 9 months ago. The cookie is cached, but the cache might not work in Jupyter, so the cookie is fetched fresh in each notebook.
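
If a stale or unwritable cache is suspected, one thing to try is clearing it and pointing yfinance at a fresh location. set_tz_cache_location is the documented API; whether the cookie lives in the same directory, and the default path shown below, are assumptions that vary by OS and version:

import shutil
from pathlib import Path
import yfinance as yf

# Assumed default cache directory on Linux; adjust for your OS and version.
stale_cache = Path.home() / ".cache" / "py-yfinance"
if stale_cache.exists():
    shutil.rmtree(stale_cache)  # drop any stale timezone/cookie cache

# Documented API for relocating the timezone cache to a known-writable directory.
yf.set_tz_cache_location("/tmp/yf-cache")

df = yf.download("AAPL")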

@oxhaowang

After some heavy testing, it appears Yahoo has blocked some of my requests.

On the failing machine (a Linux server) I got a 403 Forbidden error, whilst the working machine (a Mac), also tested with wget, would complain about an OpenSSL error instead.
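
To tell a blocked machine apart from a yfinance bug, it helps to hit the same chart endpoint directly and look at the raw status code. A sketch using plain requests; the query parameters are just an example:

import requests

# Hit the same chart endpoint yfinance uses and inspect the raw status code.
url = "https://query2.finance.yahoo.com/v8/finance/chart/AAPL"
headers = {"User-Agent": "Mozilla/5.0"}  # Yahoo often rejects the default requests UA

resp = requests.get(url, params={"interval": "1d", "range": "5d"},
                    headers=headers, timeout=10)
print(resp.status_code)   # 403 here points at a block on this host/IP, not at yfinance
print(resp.text[:200])    # the body usually says why the request was rejected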

@JIeJaitt

same problem

@fdriessen

I had the same issue. I traced it back to my ad blocker. After I whitelisted fc.yahoo.com, it all worked as expected.
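
For anyone else hitting this, a quick way to spot this kind of DNS-level blocking is to resolve the Yahoo hosts yfinance uses and check for a sinkhole address. A sketch; the host list is just the endpoints mentioned in this thread:

import socket

# Hosts yfinance talks to; an ad blocker / Pi-hole may sinkhole any of them.
hosts = ["fc.yahoo.com", "query1.finance.yahoo.com", "query2.finance.yahoo.com"]

for host in hosts:
    try:
        ip = socket.gethostbyname(host)
    except socket.gaierror as e:
        print(f"{host}: DNS lookup failed ({e})")
        continue
    blocked = ip in ("0.0.0.0", "127.0.0.1")  # common sinkhole addresses
    print(f"{host} -> {ip}" + ("  <-- likely blocked" if blocked else ""))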

@fabiodm7

I was using Pi-hole, and this domain was blocked. I requested that the maintainer remove it from the block list, and it worked fine.
