Model to generate very long text sequences

I’ve recently downloaded TensorFlow with the intent of using it via PyCharm. I’m trying to have an AI learn to write a television script, as is popular on the internet these days, and initially found this tutorial. It is, of course, very outdated and not usable on my Windows PC, but it was useful in providing pre-written code where I simply had to replace the input text with my own. The code as a whole, though, seems to be spread across a folder rather than contained in any single file, and I haven’t been able to figure out how to import this training code into my program. Does anyone know how I could manage this?

Hi,

To give you some more resources on generating text:

Also, it would be good to take a look at Google Colab for development. It’s a free environment that makes managing dependencies very easy.
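
For instance, a typical first cell in a Colab notebook installs extra packages and checks the runtime (the `transformers` package here is just an example; TensorFlow itself comes preinstalled):

```python
# Install extra dependencies directly inside the notebook.
!pip install transformers

import tensorflow as tf

# If a GPU runtime is enabled (Runtime > Change runtime type),
# the free GPU shows up here.
print(tf.config.list_physical_devices("GPU"))
```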


So would this allow me to have the AI generate a script based on the text I’ve given it? E.g., like that AI-written Harry Potter chapter, or the AI-generated movie/television scripts that have been circulating in recent years?

High-quality text generation requires large pretrained language models like BERT or GPT-3. You should not use code examples older than 1-2 years if you want a model that writes TV scripts or any other literary text.
Depending on how much RAM your computer has, not every state-of-the-art model will run on your PC. Check the size of the model, and keep in mind that running prediction on input data will require even more memory. Try using Google Colab first.
Pretrained models can be found on TensorFlow Hub.
You can also check out the Transformers library. It lets you download many models in TensorFlow format or use them via a simple pipeline. For text generation you can try GPT Neo.
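
To make the pipeline route concrete, here is a minimal sketch with the Transformers text-generation pipeline; the checkpoint, prompt, and sampling settings are just example choices:

```python
from transformers import pipeline

# "EleutherAI/gpt-neo-125M" is the smallest GPT Neo checkpoint and the most
# likely to fit in memory on an ordinary PC; the larger 1.3B and 2.7B
# checkpoints write noticeably better text if you have the RAM (or use Colab).
generator = pipeline("text-generation", model="EleutherAI/gpt-neo-125M")

prompt = "INT. SPACESHIP BRIDGE - NIGHT\n\nCAPTAIN:"
outputs = generator(
    prompt,
    max_length=200,   # total length: prompt tokens + generated tokens
    do_sample=True,   # sample instead of greedy decoding, for variety
    temperature=0.9,  # higher = more random, lower = more conservative
)
print(outputs[0]["generated_text"])
```

The first call downloads the model weights, so expect a wait; after that, generating a short continuation is reasonably quick even on CPU.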


Just to add a little bit to the excellent answer from Ekaterina,

Generating text that follows a cohesive story is very hard.
For snippets or verses the challenge is much smaller and you can get some good results; for a full Harry Potter book it would be very hard (if even possible at this time).


Yes, handling long context is an open problem and an active research topic.

E.g. see the ∞-former (Infinite Memory Transformer) paper:
https://arxiv.org/abs/2109.00301
