I am working in Google Colab, trying to download the 'imdb_reviews/subwords8k' dataset. The TensorFlow version for this session is 2.8.2. I was able to load the data for the last couple of days, but suddenly Colab shows the following error and I cannot process the data. Please have a look.
```python
import tensorflow_datasets as tfds

words, infos = tfds.load("imdb_reviews/subwords8k",
                         with_info=True, as_supervised=True)
tokenizer = infos.features['text'].encoder
```
The output before the failure:

```
WARNING:absl:TFDS datasets with text encoding are deprecated and will be removed in a future version. Instead, you should use the plain text version and tokenize the text using `tensorflow_text` (See: https://www.tensorflow.org/tutorials/tensorflow_text/intro#tfdata_example)
Downloading and preparing dataset imdb_reviews/subwords8k/1.0.0 (download: 80.23 MiB, generated: Unknown size, total: 80.23 MiB) to /root/tensorflow_datasets/imdb_reviews/subwords8k/1.0.0...
```
The snapshot of the error:

```
TimeoutError                              Traceback (most recent call last)
/usr/local/lib/python3.7/dist-packages/urllib3/connection.py in _new_conn(self)
    158             conn = connection.create_connection(
--> 159                 (self._dns_host, self.port), self.timeout, **extra_kw)
    160
TimeoutError: [Errno 110] Connection timed out

During handling of the above exception, another exception occurred:

NewConnectionError                        Traceback (most recent call last)
NewConnectionError: <urllib3.connection.HTTPConnection object at 0x7efd5a671f10>: Failed to establish a new connection: [Errno 110] Connection timed out

ConnectionError: HTTPConnectionPool(host='ai.stanford.edu', port=80): Max retries exceeded with url: /~amaas/data/sentiment/aclImdb_v1.tar.gz (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7efd5a671f10>: Failed to establish a new connection: [Errno 110] Connection timed out'))
```
Please suggest some ideas to resolve this error.
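One idea I considered, in case the timeout is transient, is wrapping the load call in a simple retry loop. This is only a minimal sketch; `load_with_retries` is a helper I wrote myself, not part of `tfds`, and it will not help if the host `ai.stanford.edu` is actually down:

```python
import time


def load_with_retries(load_fn, attempts=3, delay=5):
    """Call load_fn(), retrying a few times on connection errors."""
    for attempt in range(1, attempts + 1):
        try:
            return load_fn()
        except (ConnectionError, TimeoutError) as exc:
            print(f"attempt {attempt} failed: {exc}")
            if attempt == attempts:
                raise  # give up after the last attempt
            time.sleep(delay)


# Usage with the same call that fails for me (commented out here,
# since it needs tensorflow_datasets and network access):
# words, infos = load_with_retries(
#     lambda: tfds.load("imdb_reviews/subwords8k",
#                       with_info=True, as_supervised=True))
```

With this in place the load still fails on every attempt, which makes me think the problem is on the server side rather than a flaky connection.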