Importing text files in Google Colab from GitHub

I am trying to load and preprocess text in Google Colab using TensorFlow, and I seem to be having problems importing my text files.

Based on Example 2 of the tutorial, I ran

import pathlib
from tensorflow.keras import utils

FILE_NAMES = ['file1.txt', 'file2.txt', 'file3.txt', 'file4.txt']

# DIRECTORY_URL is the base URL the files are downloaded from
for name in FILE_NAMES:
  text_dir = utils.get_file(name, origin=DIRECTORY_URL + name)

parent_dir = pathlib.Path(text_dir).parent

However, I'm getting these odd results:

Sentence:  b'            name-with-owner="YWtlNzAwL1B5dGhvbg=="'
Label: 3
Sentence:  b'        </tr>'
Label: 0
Sentence:  b'        </tr>'
Label: 0
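Those lines look like fragments of GitHub's HTML page markup rather than the contents of my files. If the page URL is the problem, converting it to the raw.githubusercontent.com form might help; a minimal sketch, where user/repo/main are placeholder names:

```python
def to_raw_url(web_url: str) -> str:
    # Convert a github.com 'blob' page URL into the raw-content URL,
    # which serves the plain file instead of the HTML page around it.
    return (web_url
            .replace('https://github.com/', 'https://raw.githubusercontent.com/')
            .replace('/blob/', '/'))

print(to_raw_url('https://github.com/user/repo/blob/main/file1.txt'))
# -> https://raw.githubusercontent.com/user/repo/main/file1.txt
```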

Am I correct in assuming that TensorFlow is actually reading the raw text files from GitHub here, or is it downloading something else? I also went and tried the first example in the above tutorial by uploading a zipped folder of the text files, unzipping it via TensorFlow in Colab, and then reading the files.

data_url = 'GithubUrlWithZip.7z'

# download and extract the archive, as in the tutorial's first example
dataset_dir = utils.get_file(
    'dataset.7z',
    origin=data_url,
    extract=True)

dataset_dir = pathlib.Path(dataset_dir).parent
text_dir = dataset_dir/"datasets"
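As far as I can tell, get_file's extract option only handles tar and zip archives, not .7z, so a plain zip plus manual extraction may be the safer route. A self-contained sketch (the archive here is built locally just to keep the example runnable; in practice it would be the downloaded file):

```python
import pathlib
import tempfile
import zipfile

# Stand-in for the archive that utils.get_file would download.
work = pathlib.Path(tempfile.mkdtemp())
archive = work / 'dataset.zip'
with zipfile.ZipFile(archive, 'w') as zf:
    zf.writestr('datasets/file1.txt', 'First line of text\n')

# Manual extraction -- the equivalent of extract=True for a real zip.
extract_dir = work / 'extracted'
with zipfile.ZipFile(archive) as zf:
    zf.extractall(extract_dir)

sample = extract_dir / 'datasets' / 'file1.txt'
print(sample.read_text())
```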

but the text files did not read correctly when I checked.

sample_file = text_dir/"file1.txt"

with open(sample_file) as f:
  print(f.read())

The file reads as follows:

<!DOCTYPE html>
<html lang="en" data-color-mode="auto" data-light-theme="light" data-dark-theme="dark" data-a11y-animated-images="system">
    <meta charset="utf-8">
  <link rel="dns-prefetch" href="">
  <link rel="dns-prefetch" href="">
  <link rel="dns-prefetch" href="">
  <link rel="dns-prefetch" href="">
  <link rel="preconnect" href="" crossorigin>
  <link rel="preconnect" href="">

  <link crossorigin="anonymous" media="all" integrity="sha512-ksfTgQOOnE+FFXf+yNfVjKSlEckJAdufFIYGK7ZjRhWcZgzAGcmZqqArTgMLpu90FwthqcCX4ldDgKXbmVMeuQ==" rel="stylesheet" href="" /><link crossorigin="anonymous" media="all" integrity="sha512-1KkMNn8M/al/dtzBLupRwkIOgnA9MWkm8oxS+solP87jByEvY/g4BmoxLihRogKcX1obPnf4Yp7dI0ZTWO+ljg==" rel="stylesheet" href="" /><link data-color-theme="dark_dimmed" crossorigin="anonymous" media="all" integrity="sha512-cZa7DZqvMBwD236uzEunO/G1dvw8/QftyT2UtLWKQFEy0z0eq0R5WPwqVME+3NSZG1YaLJAaIqtU+m0zWf/6SQ==" rel="stylesheet" data-href="" /><link data-color-theme="dark_high_contrast" crossorigin="anonymous" media="all" integrity="sha512-WVoKqJ4y1nLsdNH4RkRT5qrM9+n9RFe1RHSiTnQkBf5TSZkJEc9GpLpTIS7T15EQaUQBJ8BwmKvwFPVqfpTEIQ==" rel="stylesheet" data-href="" /><link data-color-theme="dark_colorblind" crossorigin="anonymous" media="all" integrity="sha512-XpAMBMSRZ6RTXgepS8LjKiOeNK3BilRbv8qEiA/M3m+Q4GoqxtHedOI5BAZRikCzfBL4KWYvVzYZSZ8Gp/UnUg==" rel="stylesheet" data-href="" /><link data-color-theme="light_colorblind" crossorigin="anonymous" media="all" integrity="sha512-3HF2HZ4LgEIQm77yOzoeR20CX1n2cUQlcywscqF4s+5iplolajiHV7E5ranBwkX65jN9TNciHEVSYebQ+8xxEw==" rel="stylesheet" data-href="" /><link data-color-theme="light_high_contrast" crossorigin="anonymous" media="all" integrity="sha512-+J8j3T0kbK9/sL3zbkCfPtgYcRD4qQfRbT6xnfOrOTjvz4zhr0M7AXPuE642PpaxGhHs1t77cTtieW9hI2K6Gw==" rel="stylesheet" data-href="" /><link data-color-theme="light_tritanopia" crossorigin="anonymous" media="all" integrity="sha512-AQeAx5wHQAXNf0DmkvVlHYwA3f6BkxunWTI0GGaRN57GqD+H9tW8RKIKlopLS0qGaC54seFsPc601GDlqIuuHg==" rel="stylesheet" data-href="" /><link data-color-theme="dark_tritanopia" crossorigin="anonymous" media="all" integrity="sha512-+u5pmgAE0T03d/yI6Ha0NWwz6Pk0W6S6WEfIt8veDVdK8NTjcMbZmQB9XUCkDlrBoAKkABva8HuGJ+SzEpV1Uw==" rel="stylesheet" data-href="" />
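That is clearly a GitHub HTML page, not one of my text files. A quick check like the following (a hypothetical helper, not from the tutorial) would catch this before any preprocessing:

```python
def looks_like_html(first_chunk: bytes) -> bool:
    # Heuristic: a served web page starts with an HTML doctype or <html> tag,
    # while a raw text file should not.
    head = first_chunk.lstrip().lower()
    return head.startswith(b'<!doctype html') or head.startswith(b'<html')

print(looks_like_html(b'<!DOCTYPE html>\n<html lang="en">'))  # True
print(looks_like_html(b'Some ordinary line of text'))         # False
```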

I also tried putting the files into my Google Drive, but for some reason I had trouble reading the text files from there. If I work with larger and more numerous text datasets in the future, I presume I wouldn't be uploading them to Google Drive and then reading them as local files from Colab, so I want to properly learn how to import text files from GitHub or another source.
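For what it's worth, once the files are local (a mounted Drive appears under /content/drive in Colab), reading them with pathlib is straightforward; a self-contained sketch using a temporary directory in place of the real Drive path:

```python
from pathlib import Path
import tempfile

# Hypothetical stand-in for something like
# Path('/content/drive/MyDrive/datasets') after drive.mount() in Colab.
data_dir = Path(tempfile.mkdtemp())
(data_dir / 'file1.txt').write_text('hello\nworld\n')

# Read every .txt file in the directory into a dict keyed by file name.
texts = {p.name: p.read_text() for p in sorted(data_dir.glob('*.txt'))}
print(texts['file1.txt'].splitlines())
```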

The tutorials seem to host their files in Google Cloud Storage, but when I tried uploading my files there, it was extremely slow.

Is there another method generally used for ML models like this, or for text-based work in general? Any tips would be appreciated.

This issue has since been solved in this Stack Overflow page.