GPT-2 Beatles song inspired by Bohemian Rhapsody
This song was composed by GPT-2, trained on The Beatles' back-catalogue lyrics, and inspired by the opening lines of Queen's Bohemian Rhapsody: Is this the real life? Is this just fantasy? This feels […]
Data Scientist by day, Gamer by night
Here we go, never-before-heard songs by The Beatles… written not by Paul & John, but by an AI algorythm (see what I did there) 🙂 Data grabbed from here: Beatles lyrics […]
After a couple of conversations with different mates over the weekend, I was motivated to dust off my old GPT-2 scripts and play a little more. Libraries have changed in the near […]
Great news from openai regarding their GPT-2 model: they've released their 345M-parameter version 🙂 GPT-2 Interim Update, May 2019 We're implementing two mechanisms to responsibly publish GPT-2 and hopefully future releases: […]
Thought I'd ask the Trump Tweets fine-tuned model a few questions 🙂 All results unedited. Model prompt >>> build a wall to protect against new Mexican people breaking into our country! […]
Currently working on a new bot: gpt-2 117M fine-tuned on Trump tweets 🙂 Extracted the tweet history from the Trump Twitter Archive, limiting the extract to solely Trump's own tweets. This gives me […]
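That extraction step could look something like this minimal Python sketch. The field names (`text`, `isRetweet`) and the output filename are assumptions based on a typical Trump Twitter Archive JSON export, not the exact script used here:

```python
import json  # the archive export is JSON; shown for a real file load

def extract_tweets(records):
    """Keep only original tweets: drop anything flagged as a
    retweet or whose text starts with the classic 'RT @'."""
    texts = []
    for rec in records:
        text = rec.get("text", "")
        # 'isRetweet' uses "t"/"f" strings in the assumed export format
        if rec.get("isRetweet") == "t" or text.startswith("RT @"):
            continue
        texts.append(text)
    return texts

# Tiny inline sample standing in for the real archive export.
sample = [
    {"text": "We will build a great wall!", "isRetweet": "f"},
    {"text": "RT @someone: something", "isRetweet": "t"},
]

corpus = extract_tweets(sample)

# One tweet per line makes a simple plain-text corpus for fine-tuning.
with open("trump_tweets.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(corpus))
```

For the real archive you would load the downloaded export with `json.load` in place of the inline sample.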
Given what I've learned over the past few days, and the success of the Trump bot, I thought I'd try modelling the Mike Hosking data again; extracts below 🙂 This is certainly […]
Today I decided to try turning the gpt-2 model to Trump's speeches, and the results are scarily successful. Here is the file of the Trump 2016 Presidential Election Speeches that I […]
I've pivoted on the theme of the text I'm using to fine-tune openai's gpt-2 model, and have picked up a few tips along the way; I now have Tensorflow gpt-2 writing […]
Using Google's CoLab [https://colab.research.google.com] I've fine-tuned the gpt-2 117M model on an extract of text from a kiwi radio host, Mike Hosking. I chose Mike not because I ever agree with […]