The early days of LLMs
Six years ago, ChatGPT didn’t exist, and LLMs weren’t reasoning or doing people’s homework, but they were thrilling all the same. My machine learning journey started in 2018, when I was in South Sudan trying to use AI to count cows from space. By late 2019 I was living in Madagascar, and I bought a Jetson Nano to run ML models without depending on an internet connection. As a literature PhD, I wanted to see how well an LLM could generate creative work, and the haiku seemed like a good test case.
Around that time, some excellent GPT-2 haiku projects (such as https://github.com/brianweet/gpt-2-haiku) used fine-tuning to generate interesting results. The Jetson Nano didn’t have enough memory to fine-tune the model, so I experimented with the base model using inference alone.
I wrote code that prompted GPT-2 with a haiku-themed seed and then pieced together output with the right syllable counts. Each poem took about a minute to generate, appearing on a small external monitor until the next one was finished - an ephemeral dance accompanied by the furious spinning of the fan. The results were mostly absurd, but occasionally interesting.
Recently, I wondered if a modern LLM assistant could write code to get better results from GPT-2.
GPT-2 now runs easily on a laptop. The examples below show the sort of output the model produces without fine-tuning - it quickly becomes incoherent. (I remember an early chatbot experiment that seemed to work fine until my spouse tested it. In an example of early AI safety red-teaming, she got it to generate an endless string of “DIE DIE DIE DIE DIE” within 5 turns.)
This is what happens with prompts like “Haiku” or “Here is a Haiku:”
from transformers import pipeline, set_seed
generator = pipeline('text-generation', model='gpt2')
set_seed(42)
generator("Here is a haiku poem,", num_return_sequences=5)
Setting `pad_token_id` to `eos_token_id`:50256 for open-end generation.
[{'generated_text': 'Here is a haiku poem, written by an American writer, in which he describes the story of the first American man to commit suicide in the world.\n\nThe poem is called "The Way of the Soldier."\n\nThe story about the first man to commit suicide in the world\n\nWritten by: Charles P. Smith, Jr.\n\nEdited by: Robert L. Williams\n\nThe poem was published in 1858 as a short story.\n\nThe poem was written about a man named "Vincent" who committed suicide, after which he read his own poem and asked for forgiveness from his fellow citizens.\n\nIn 1859, Vincent wrote, "I am a man of spirit, a man of courage. A man of courage. A man of spirit. A man of courage. A man of courage."\n\nVincent continued, "I am not afraid of anything, of the world, of life. I am not afraid of any thing, of my fellow men, of the world, of death. I am not afraid of the world. I am not afraid of anything, of the world, of life. I am not afraid of any thing, of the world, of death. I am not afraid of any thing, of my fellow men, of the world'},
{'generated_text': 'Here is a haiku poem, "To die for my name" by Nils Nils.\n\nThis is the original poem by Nils Nils.\n\nThe first part was composed for the novel "The Long Night".\n\nThe second part was composed for the novel "The Long Night" by Nils Nils.\n\nThe third part was composed for the novel "The Long Night" by Nils Nils.\n\nThe fourth part was composed for the novel "The Long Night" by Nils Nils.\n\nThe fifth part was composed for the novel "The Long Night" by Nils Nils.\n\nThe sixth part was composed for the novel "The Long Night" by Nils Nils.\n\nThe seventh part was composed for the novel "The Long Night" by Nils Nils.\n\n\nThe eighth part was composed for the novel "The Long Night" by Nils Nils.\n\n\nThe ninth part was composed for the novel "The Long Night" by Nils Nils.\n\n\nThe tenth part was composed for the novel "The Long Night" by Nils Nils.\n\n\nThe eleventh part was composed for the novel "The Long Night" by Nils Nils.\n\nThe tw'},
{'generated_text': 'Here is a haiku poem, adapted from a poem about a young boy:\n\nI am the son of an old man,\n\nand the father of a young girl.\n\nAnd the day of the coming of the sun,\n\nthe year, the year,\n\nthe year of heaven.\n\nI am the son of an old man, and the father of a young girl.\n\nBut when the sun comes, the year, the year of heaven,\n\nthe year of the moon,\n\nthe year of the stars,\n\nand the year of the angels;\n\nI am the son of an old man, and the father of a young girl.\n\nBut when the sun comes, the year, the year of heaven,\n\nthe year of the moon,\n\nthe year of the stars,\n\nand the year of the angels;\n\nI am the son of an old man, and the father of a young girl.\n\nBut when the sun comes, the year, the year of heaven,\n\nthe year of the moon,\n\nthe year of the stars,\n\nand the year of the angels;\n\nI am the son of an old man, and the father of a young girl.'},
{'generated_text': "Here is a haiku poem, which I found on my way to work.\n\nI read it for the first time in my head when I was a kid and it's a pretty short poem about a boy who loves to eat. It's a good story about a boy who's lost his love and ends up as a child.\n\nThere is a scene where a young girl comes to the end of the story, and she's crying and she says the word 'love' and what am I doing to my mom and dad and what about my dad. And the girl says, 'I love you too and I am your best friend, oh my god, I love you too!' And she goes, 'I love you, my mommy loves you too. I love you too, my mommy loves you too.' I think that's a good bit of emotional empathy, if you will.\n\nIt's a pretty short story. On my back are a few words that I've said often enough, but I think the gist is this:\n\nI love you, my mommy loves you too. I love you, my mommy loves you too. I love you, my mommy loves you too. I love you, my mommy loves you too.\n\nThere"},
{'generated_text': 'Here is a haiku poem, from a popular Japanese language book, published in the early 1950s, by the famous Japanese singer Takayuki Utsukai.\n\n\nThe poem was sung at the University of California, Berkeley in 1952. It was featured in a film called "The Love Song of the Spring".\n\n\nThe film\'s star was Takayuki Utsukai. In the film, Utsukai was portrayed by a young Japanese man named Akiyobu Suzuki, and the film\'s director was the Japanese novelist Yoshinori Kitano.\n\n\nUtsukai\'s name was also mentioned as a character in "Love Song of the Spring".\n\n\nThe poem was translated into English by the English version of the popular Japanese manga series "One Piece". The novel\'s title refers to the title character, Utsukai.\n\n\nThe poem was also adapted into the novel by the singer-songwriter Otsuki Yoshinori.\n\n\nIn the novel, this haiku poem was read aloud and translated into Japanese.\n\n\nUtsukai was also an idol in the idol group the Kinkaku Yakuza.\n\n\nIn addition to a song, there were also other haikus included in the anime.\n\n\nThere were'}]
generator("HAIKU:", num_return_sequences=5)Setting `pad_token_id` to `eos_token_id`:50256 for open-end generation.
[{'generated_text': 'HAIKU: Right. I want to tell you, I\'m not going to go into detail about what he does there, but he\'s a big fan of a lot of the things you can relate to in terms of the things he\'s doing here. He\'s just a really great guy. He\'s really into the game. He has a lot of love for the game and wants to try to make it better. I could go on and on. But I\'m going to leave it there.\n\nQ: If there\'s anything you would like to add in the future, what would it be?\n\nA: I\'m sure we\'ll get to that. I\'m sure you\'ll have to wait and see.\n\nQ: I know you said you were going to do the "No-Sell List" thing. Are you still looking for a new GM?\n\nA: I\'m definitely looking to do it. I\'m sure I\'ve done it before. I\'m not looking to do that. I\'m just looking to hire somebody who can bring that expertise to the table.\n\nQ: Do you have any specific plans for next season?\n\nA: We\'ve got a lot of work ahead of us on that. I\'m excited for what'},
{'generated_text': 'HAIKU: You saw that? (laughs) I think he was making a lot of money. It was like, "Oh my God, that\'s going to be a lot bigger than what I paid for."\n\nMARK: It was. (laughs)\n\nKATIE: It was. You got to see it. I got to see it.\n\nMARK: So, I mean, I mean, that\'s what a lot of us do. We were all working on the film and we got to see it.\n\nKATIE: Yeah, yeah.\n\nMARK: How about when you\'re working on a film, how do you get to see it and what do you get to see?\n\nKATIE: Well, I get to see it when I\'m working on a film. And I get to see it when I\'m working on a film.\n\nMARK: Oh, you\'re right.\n\nKATIE: The last time I saw it, it was a little bit darker than it was in the first place.\n\nMARK: (laughs)\n\nKATIE: I mean, it\'s a little bit darker. And when I was making that movie, I was feeling'},
{'generated_text': 'HAIKU: The Unfair Test Demon, Aamir Demon God Dragon, Baal awoken sinner Demon King, Zaerog∞ Demon Lord, Belial Demon King, Cauchemar Demon Man, Metatron Demon Messenger, Metatron TAMADRA Fallen Angel Lucifer Fastest Sky Pirate, Balthier Demon Lord, Belial Demon Lord, Buster Siegfried Demon Lord, Buster Siegfried Protector of the Sea, Ganesha Demon God, Ra Dragon Anubis Demon God of Fury, Ares demon of the damned, beelzebub Demon God of the Climactic Night, Lucifer Demon God of the Night, Tsukuyomi Dragon Anubis Demon God of Steel, Verra Demon which Kill Devil Dragon, Canopus Ateria Demon Weal, Shouki Demon Blade Brave, Kopisubi The Flash Maleficent Dragon Lord Zaerog∞ Demon Lord, Furi Zoldyck Thunderdragon Indra Thundering CyberDragon, Shedar Thunderlord Shocker Thunderlord Zinogre Thunderous Demon Princess, Raijin Thunderous Screaming Demon Count, Ronove Thundershock Tiger Dragon, Balidoora Thunderstone Edge, Caladbolg Tiamat Tidus Tifa Tigrex Tigrex & Rex Cat Timbel Armor X Dragon Caller, Ana Time Guardians, Tak'},
{'generated_text': "HAIKU: The Demon's Trap The Game The Challenge The Chaos Engine The Charming Empire The Charnel House Trilogy The Chosen RPG The Chosen Warriors The chronicles of Emerland. Solitaire. The Clans - Saga of the Twins The Cleansing The Collider The Collider 2 The Coma The Coma: Recut The Concourse The Confines Of The Crown The Construct The Consuming Shadow The Cooking Game The Corridor: On Behalf Of The Dead The Council The Count Lucanor The Counting Kingdom The Crooked Man The Crow's Eye The Crowded Party Game Collection The Crypts of Anak Shaba - VR The Culling Of The Cows The Curious Expedition The Cursed Forest The Dark Eye: Chains of Satinav The Dark Inside Me The Dark Legions The Dark Stone from Mebara The Dark Tales of Katarina The Darkside Detective The Dead Linger The Deadly Tower of Monsters The Deal The Decimation of Olarath The Deed The Deed: Dynasty The Deer The Deer God The Defenders: The Second Wave The Deletion The Descendant The Detail The Dishwasher: Vampire Smile The Disney Afternoon Collection The District The Divine Paradox The Dolls The Dope Game The Dream Machine The Dreamatorium of Dr. Magnus 2 The Dreamlord The"},
{'generated_text': "HAIKU: That's the kind of story I always wanted to hear. I have a family that I don't really know well, but I've had some good conversations with other families. I'm a professional hockey player, and I'm a professional broadcaster, and I've seen a lot of great things in the past, but I'm not sure yet how I would want to continue playing hockey in my life.\n\nA: When we were talking about the future, you were talking about a certain player, but it was also about you and the guys, and the team, and the fans, and the fans—you were doing a great job.\n\nAdvertisement\n\nQ: And you're not going to play for the team again?\n\nA: No, I'm not going to play for the team in any way.\n\nQ: You're not going to play for the team, but you're going to play for a certain team?\n\nAdvertisement\n\nA: No, I'm going to play for a certain team. So all of a sudden, I'm going to play for the team.\n\nQ: I've been told it's about a certain player, but I'm not going to play for it?\n\nA: No"}]
Here’s an example of using the first words of a famous haiku as a prompt, to see if the model can generate the next few lines (it doesn’t).
generator("Old pond a frog jumps", num_return_sequences=5)
#expected completion - "the sound of water"
Setting `pad_token_id` to `eos_token_id`:50256 for open-end generation.
[{'generated_text': 'Old pond a frog jumps up from under the water to attack a crocodile.\n\nA frog jumps up from under the water to attack a crocodile.\n\nA frog jumps up from under the water to attack a crocodile.\n\nA frog jumps up from under the water to attack a crocodile.\n\nA frog jumps up from under the water to attack a crocodile.\n\nA frog jumps up from under the water to attack a crocodile.\n\nA frog jumps up from under the water to attack a crocodile.\n\nA frog jumps up from under the water to attack a crocodile.\n\nA frog jumps up from under the water to attack a crocodile.\n\nA frog jumps up from under the water to attack a crocodile.\n\nA frog jumps up from under the water to attack a crocodile.\n\nA frog jumps up from under the water to attack a crocodile.\n\nA frog jumps up from under the water to attack a crocodile.\n\nA frog jumps up from under the water to attack a crocodile.\n\nA frog jumps up from under the water to attack a crocodile.\n\nA frog jumps up from under the water to attack a crocodile.\n\nA frog jumps'},
{'generated_text': 'Old pond a frog jumps over a pond.\n\nA frog jumps over a pond. A large frog climbs up a large pond.\n\nA frog jumps over a pond. A small frog climbs over a small pond.\n\nA frog jumps over a pond. A large frog jumps over a pond.\n\nA frog jumps over a pond. A small frog jumps over a small pond.\n\nA frog jumps over a pond. A small frog jumps over a small pond.\n\nA frog jumps over a pond. A small frog jumps over a small pond.\n\nA frog jumps over a pond. A small frog jumps over a small pond.\n\nA frog jumps over a pond. A small frog jumps over a small pond.\n\nA frog jumps over a pond. A small frog jumps over a small pond.\n\nA frog jumps over a pond. A small frog jumps over a small pond.\n\nA frog jumps over a pond. A small frog jumps over a small pond.\n\nA frog jumps over a pond. A small frog jumps over a small pond.\n\nA frog jumps over a pond. A small frog jumps over a small pond.\n\nA frog jumps over a pond. A small frog jumps over a small pond.\n'},
{'generated_text': 'Old pond a frog jumps over.\n\nIn addition to being a common sight, however, it is also sometimes called a "watermelon" because of its color.\n\nWhat is a "pest" instead?\n\nThere are many different types of pest, and it is a common misconception that you need to know how to properly identify one.\n\nThere are different types of pest - the common, the exotic, and the exotic.\n\nThe exotic is a watermelon, and the common variety is a potted fruit.\n\nThe exotic is a watermelon with a little blood on the flesh.\n\nIn terms of flavor, the potted fruit is a bit more sweet, and the exotic is a bit more "sugar."\n\nThe exotic is also known as a red-orange, and the common variety is a red-orange.\n\nOne of the most common types of pest is called a "watermelon" because it is yellowish.\n\nThe common variety is called "potted fruit," but for some reason it is more common for the red-orange variety to be called a "potted fruit."\n\nThe exotic variety is called a "potted fruit," but for some reason, it is more common for'},
{'generated_text': 'Old pond a frog jumps over.\n\n"I\'m not going to tell you a secret," she says. "I\'m not going to tell you a secret, but I do want to tell you something. But you\'ve got to remember the name and what it means. It\'s the name of this village. It\'s the name of a village. And I\'m going to tell you how it all started."\n\n"What," says the girl.\n\n"I was just talking to a local girl," says the frog. "We met over a beer. She said she wanted to know the village name. Why didn\'t you tell her?"\n\n"Because it was too hard for us to find out. It\'s not even a village, isn\'t it?"\n\n"No, it\'s a village," says the frog. "You see the name of this girl in the book, right? It\'s here. It was here before the war. It\'s the name of this village. It\'s the name of a village. And I was so embarrassed that I couldn\'t tell her. I had to put it away."\n\n"I knew you wouldn\'t know any better," answers the girl.'},
{'generated_text': 'Old pond a frog jumps into the water. The frogs swim around the pond and go back to the pond. Eventually they get to the bottom of the pond. They make it to the bottom where they drown. The next morning the water is clear and the frogs start swimming again. Eventually they are able to get to the top of the pond and they all swim back down the hill.\n\n\nThis is the only way to get to the top of the pond.\n\n\nThe frog is back to the pond and when he comes back it is time to go back to the pond to get some water. The water is clear and the frog is right back up the hill.\n\n\nThe frog is back to the pond and he is back to the pond.'}]
My 2020 method (prompting based on a haiku-related seed word) was essentially an ‘accidental haiku finder’ combining generative text with procedural generation. (Spoiler alert - thanks to Claude, I’ve now figured out that a few-shot approach would have generated better results without fine-tuning!)
Show the 2020 code
#export
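#Dependencies assumed from earlier cells of the original notebook (not shown here):
#import re, import syllapy, and import numpy as np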
##General verse_gen - inputs are a block of generated text and the number of syllables required
def verse_gen(verse_input, syllable_length):
    global verse_words
    global verse_string
    global verse_count
    global verse_syllable_count
    #Go to first whitespace, count syllables. Continue until "syllable_length" syllables. If over required amount syllables try with new input.
    #initialize counters
    #left side location in string
    y=0
    #right side location in string
    x=1
    #counter for number of times to try this before getting new input
    z=0
    #initial syllable count of zero
    verse_syllable_count=0
    #Split the input to make it a list of words
    verse_words=verse_input.split(' ')
    #Set a counter - try this a number of times then move on
    while z < 40:
        if verse_syllable_count < syllable_length:
            print("Adding next word to the string")
            #Put words from y to x in a string
            verse_string=' '.join(verse_words[y:x])
            #If no special characters, Count the syllables
            #Validate haiku-able phrases. If not, reinitialize the string and keep going
            #Get rid of any phrase with numerals or special characters in it
            if re.match("^[a-zA-Z\s-]*$", verse_string):
                verse_syllable_count = syllapy.count(verse_string)
                #increment x
                x=x+1
                #increment z
                z=z+1
                #print the string
                print(verse_string)
            else:
                print(verse_string, "Invalid characters. Moving up in string.")
                #reinitialize verse_string
                verse_string=""
                verse_syllable_count=0
                y=y+1
        if verse_syllable_count == syllable_length:
            #Get rid of any trailing spaces
            verse_string = verse_string.strip()
            last_word=verse_string.split(" ")[-1]
            #Get rid of any phrase ending in an article or preposition words
            bad_word_list = 'as of the and if or at by our your my'
            bad_word_set = set(bad_word_list.split())
            if (last_word in bad_word_set):
                print(verse_string, ": String ends in article or other undesirable word. Moving up in string")
                #reinitialize verse_string
                verse_string=""
                verse_syllable_count=0
                y=y+1
            #If successful, print string and return
            else:
                print("Haiku line found:\n", verse_string)
                return verse_string
        if verse_syllable_count > syllable_length:
            #If string is too long, reinitialize the string and keep going
            print(verse_string, ": String too long. Moving up in string.")
            #reinitialize verse_string
            verse_string=""
            verse_syllable_count=0
            y=y+1
            #verse_gen(verse_input, syllable_length)
    #Get a new string and start over if this string doesn't work
    if z == 40:
        print("Getting new input.")
        #text_generator(state_dict)
        verse_input = GPT2_output
        verse_string=verse_gen(verse_input, syllable_length)
    return verse_string
#END OF VERSE ONE GEN
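#Note: text_generator() and the globals state_dict, GPT2_seed_text, and GPT2_output come from an
#earlier cell of the original notebook (not shown here) - presumably a wrapper that runs GPT-2 on
#GPT2_seed_text and stores the raw completion in GPT2_output.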
def haiku_gen():
    import random
    global GPT2_seed_text
    global haiku_seed
    lines = open('sajiki.txt').read().splitlines()
    GPT2_seed_text = random.choice(lines)
    haiku_seed = GPT2_seed_text
    text_generator(state_dict)
    #Code to generate verse 1:
    verse_string = ""
    verse_input = GPT2_output
    syllable_length = 5
    verse_one_string=verse_gen(verse_input, syllable_length)
    #Code to generate verse 2:
    verse_string = ""
    GPT2_seed_text = verse_one_string
    text_generator(state_dict)
    verse_input = GPT2_output
    syllable_length = 7
    verse_two_string=verse_gen(verse_input, syllable_length)
    #Code to generate verse 3:
    verse_string = ""
    GPT2_seed_text = verse_one_string
    text_generator(state_dict)
    verse_input = GPT2_output
    syllable_length=5
    verse_three_string=verse_gen(verse_input, syllable_length)
    #Print finished haiku
    print("Here is the haiku seed:")
    print(haiku_seed)
    print("Here is the haiku:")
    #Print finished haiku
    finished_haiku=''
    finished_haiku='\n'.join([verse_one_string,verse_two_string,verse_three_string])
    print(finished_haiku)
    #Add finished haiku to a list and date
    #Todo add date and time
    #import datetime
    from datetime import datetime
    # get current date
    now = datetime.now()
    # convert current date into timestamp
    haiku_time = now.strftime("%m/%d/%Y, %H:%M:%S")
    f = open("haikulist.txt", "a")
    f.write("\n\nHaiku seed:\n"+haiku_seed+"\n\nHaiku:\n"+finished_haiku+"\n"+haiku_time)
    f.close()
    #Place finished haiku in an input for GUI (clear it out first)
    f = open("latesthaiku.txt", "w")
    f.write(finished_haiku)
    f.close()
    #Put verse2 in as GPT seedtext seed ## This leads to too much convergence
    # f.write(verse_two_string)
# get sajiki from http://jti.lib.virginia.edu/japanese/haiku/saijiki/full.html -> sajiki.txt
import random
from IPython.display import Audio
wave = np.sin(2*np.pi*400*np.arange(10000*2)/10000)
Audio(wave, rate=30000, autoplay=True)
##Run the module:
#Initial arguments go here:
from IPython.display import Audio
args_nsamples = 1
args_batch_size = -1
args_length = 1
args_unconditional = 0
args_temperature = 0.9
args_top_k = 40
args_quiet = 1
verse_input = ""
z = 0
while z < 100:
    # lines = open('sajiki.txt').read().splitlines()
    # GPT2_seed_text = random.choice(lines)
    haiku_gen()
    #Beep after each iteration
    wave = np.sin(2*np.pi*400*np.arange(10000*2)/10000)
    Audio(wave, rate=30000, autoplay=True)
    z = z + 1
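    #Note: a bare Audio(...) expression inside a loop is not rendered by Jupyter (only a cell's
    #last expression is auto-displayed), so the per-iteration beep would need
    #IPython.display.display(Audio(...)) to actually sound.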
#Beep when all done
wave = np.sin(2*np.pi*400*np.arange(10000*2)/10000)
Audio(wave, rate=40000, autoplay=True)
Here’s an indicative result of a generated “haiku”:
Haiku seed:
kōri , ice
Haiku:
ice water is
extremely sensitive
But you might want to
To be clear, most of the results were terrible!
Here are some of the more interesting generations:
There will be no
shortage of good things going
have insurance
Should we be worried
are the implications
taking the first steps
Lactobacillus
use of a strain as a
A novel species
It also generated some real oddballs and many nonsensical results -
“The only
now will be my story. My
now is not a good
which saw the season
premiere of The Mummy
and will likely
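As an aside: for anyone who wants to try the 2020 ‘accidental haiku finder’ idea without the original notebook’s plumbing (text_generator, sajiki.txt, the Jetson-specific setup), here is a rough sketch of the same approach against the current transformers pipeline and syllapy. It’s a simplified stand-in rather than the original code - the helper names and parameters are mine - and, like the original, it mostly produces accidents.
# Rough modern sketch of the 2020 "accidental haiku finder" (simplified rewrite, not the original code)
from transformers import pipeline, set_seed
import re
import syllapy

generator = pipeline('text-generation', model='gpt2')
set_seed(42)

def find_line(text, target):
    """Return the first run of clean words in text totalling `target` syllables, or None."""
    words = [w for w in text.split() if re.match(r"^[a-zA-Z'-]+$", w)]
    for start in range(len(words)):
        count = 0
        for end in range(start, len(words)):
            count += syllapy.count(words[end])
            if count == target:
                return ' '.join(words[start:end + 1])
            if count > target:
                break
    return None

def accidental_haiku(seed_word):
    #Generate one passage from the seed word, then scan it for 5-, 7-, and 5-syllable word runs
    text = generator(seed_word, max_new_tokens=80, do_sample=True)[0]['generated_text']
    lines = [find_line(text, n) for n in (5, 7, 5)]
    return '\n'.join(lines) if all(lines) else None

print(accidental_haiku("ice"))
Because all three lines are scanned from the same generated passage, they can overlap or repeat - one of several reasons the few-shot prompt shown at the end of this post works better.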
Asking LLMs for help in 2025
Now, could Claude in 2025 come up with a better solution?
2025 prompt to Sonnet 4.5:
I’d like you to use GPT-2 to create a haiku. I’m accessing GPT-2 by using a Jupyter notebook, using the following code. Please build off of this example to make a GPT-2 Haiku generator using Python script that I can run in a Jupyter notebook on my MacBook.
from transformers import pipeline, set_seed
generator = pipeline('text-generation', model='gpt2')
set_seed(42)
generator("Hello, I'm a language model,", max_length=30, num_return_sequences=5)
Here’s Sonnet 4.5’s answer:
Cell 1
# GPT-2 Haiku Generator - Jupyter Notebook Version
# Copy each cell below into separate cells in your Jupyter notebook
# ============================================
# CELL 1: Install dependencies (if needed)
# ============================================
# Uncomment and run if you need to install transformers
# !pip install transformers torch
Cell 2
# ============================================
# CELL 2: Import libraries and define functions
# ============================================
from transformers import pipeline, set_seed
import re
import random
# Syllable counting using a simple heuristic
def count_syllables(word):
"""Count syllables in a word using vowel-counting heuristic."""
word = word.lower().strip()
if len(word) == 0:
return 0
word = re.sub(r'[^a-z]', '', word)
vowels = "aeiouy"
syllable_count = 0
previous_was_vowel = False
for char in word:
is_vowel = char in vowels
if is_vowel and not previous_was_vowel:
syllable_count += 1
previous_was_vowel = is_vowel
if word.endswith('e') and syllable_count > 1:
syllable_count -= 1
if syllable_count == 0:
syllable_count = 1
return syllable_count
def count_line_syllables(line):
"""Count total syllables in a line of text."""
words = re.findall(r'\b[a-zA-Z]+\b', line)
return sum(count_syllables(word) for word in words)
def generate_haiku_line(generator, prompt, target_syllables, max_attempts=50):
"""Generate a line with approximately the target number of syllables."""
for attempt in range(max_attempts):
output = generator(
prompt,
max_length=20 + attempt,
num_return_sequences=1,
temperature=0.8 + (attempt * 0.02),
do_sample=True,
top_k=50,
top_p=0.95
)[0]['generated_text']
generated = output[len(prompt):].strip()
lines = re.split(r'[.!?\n]', generated)
if lines:
line = lines[0].strip()
syllables = count_line_syllables(line)
if abs(syllables - target_syllables) <= 1:
return line, syllables
return line, syllables
def generate_haiku(seed=None, theme="nature"):
"""Generate a haiku using GPT-2."""
if seed is not None:
set_seed(seed)
random.seed(seed)
generator = pipeline('text-generation', model='gpt2')
theme_prompts = {
"nature": ["The autumn leaves", "Morning dew on", "Cherry blossoms",
"The mountain peak", "Ocean waves crash"],
"seasons": ["Winter snow falls", "Spring flowers bloom", "Summer heat waves",
"Autumn winds blow"],
"zen": ["Silent meditation", "Peace surrounds the", "Quiet temple bell",
"Moonlight reflection"],
"ocean": ["The ocean tide", "Waves upon the", "Deep blue water",
"Coastal sunset"],
"custom": [theme]
}
if theme in theme_prompts:
prompts = theme_prompts[theme]
else:
prompts = theme_prompts["custom"]
print("Generating haiku...\n")
# Line 1: 5 syllables
prompt1 = random.choice(prompts)
line1, syl1 = generate_haiku_line(generator, prompt1, 5)
print(f"Line 1 ({syl1} syllables): {line1}")
# Line 2: 7 syllables
line2, syl2 = generate_haiku_line(generator, line1 + " ", 7)
print(f"Line 2 ({syl2} syllables): {line2}")
# Line 3: 5 syllables
line3, syl3 = generate_haiku_line(generator, line2 + " ", 5)
print(f"Line 3 ({syl3} syllables): {line3}")
haiku = f"{line1}\n{line2}\n{line3}"
print("\n" + "="*50)
print("GENERATED HAIKU:")
print("="*50)
print(haiku)
print("="*50)
print(f"Syllable count: {syl1}-{syl2}-{syl3}")
return haiku
print("✅ Functions loaded successfully!")
Cell 3
# ============================================
# CELL 3: Generate a single haiku
# ============================================
# Generate one haiku with a specific seed
haiku = generate_haiku(seed=42, theme="nature")
==================================================
GENERATED HAIKU:
==================================================
look a bit too nice
to use in any way
in their own life
==================================================
Syllable count: 5-6-4
Cell 4
# ============================================
# CELL 4: Generate multiple haikus
# ============================================
# Generate 3 different haikus
for i in range(3):
print(f"\n{'#'*50}")
print(f"HAIKU #{i+1}")
print(f"{'#'*50}\n")
generate_haiku(seed=42+i, theme="nature")
==================================================
GENERATED HAIKU:
==================================================
and the mountains
ʻOlaʻo, but this would not happen
ʻ” in the future
==================================================
Syllable count: 4-6-4
Cell 5
# ============================================
# CELL 5: Try different themes
# ============================================
# Try different themes
themes = ["nature", "seasons", "zen", "ocean"]
for theme in themes:
print(f"\n🌸 Theme: {theme.upper()} 🌸")
generate_haiku(seed=42, theme=theme)
print("\n")
==================================================
GENERATED HAIKU:
==================================================
wave and the sun
- or a couple of times
a lot of times
==================================================
Syllable count: 4-6-5
Cell 6
# ============================================
# CELL 6: Custom theme
# ============================================
# Use your own custom starting phrase
custom_haiku = generate_haiku(seed=42, theme="The starry night")
The results have some of the same issues (incoherence), but with fewer bad generations than the original method.
And here’s what Opus 4.1 came up with. It leans more heavily on a fixed set of stock seed phrases. Opus proposed both an “advanced” and a “simplified” script; below is the “advanced” version. Unlike Sonnet, Opus did not actually split the output code into cells for a Jupyter notebook, despite claiming to. I’ve left the results below - they’re not great.
Opus version code - Part 1
"""
GPT-2 Haiku Generator
This script uses GPT-2 to generate text and formats it into haikus (5-7-5 syllable structure)
"""
from transformers import pipeline, set_seed
import random
import re
import syllables # You'll need to install: pip install syllables
class GPT2HaikuGenerator:
def __init__(self, model='gpt2'):
"""Initialize the GPT-2 generator"""
self.generator = pipeline('text-generation', model=model)
self.haiku_prompts = [
"Cherry blossoms",
"Morning dewdrops",
"Autumn leaves fall",
"Winter silence",
"Summer breeze",
"Mountains standing",
"Ocean waves",
"Moonlight shining",
"Birds singing",
"Rain falling"
]
def count_syllables(self, word):
"""Count syllables in a word using the syllables library"""
try:
return syllables.estimate(word)
except:
# Fallback: rough estimation if library fails
return max(1, len(re.findall(r'[aeiouAEIOU]', word)))
def extract_words_with_syllables(self, text, target_syllables):
"""Extract words from text that sum to target syllables"""
words = re.findall(r'\b\w+\b', text)
result = []
current_syllables = 0
for word in words:
syllable_count = self.count_syllables(word)
if current_syllables + syllable_count <= target_syllables:
result.append(word)
current_syllables += syllable_count
if current_syllables == target_syllables:
break
return ' '.join(result), current_syllables
def generate_haiku_from_text(self, text):
"""Try to format generated text into haiku structure"""
# Clean the text
text = re.sub(r'[^\w\s]', '', text)
words = text.split()
if len(words) < 10:
return None
# Try to build haiku lines
line1, syll1 = self.extract_words_with_syllables(' '.join(words), 5)
remaining_words = ' '.join(words[len(line1.split()):])
line2, syll2 = self.extract_words_with_syllables(remaining_words, 7)
remaining_words = ' '.join(words[len(line1.split()) + len(line2.split()):])
line3, syll3 = self.extract_words_with_syllables(remaining_words, 5)
# Check if we got close enough to proper syllable counts
if (4 <= syll1 <= 6) and (6 <= syll2 <= 8) and (4 <= syll3 <= 6):
return f"{line1}\n{line2}\n{line3}"
return None
def generate_haiku(self, prompt=None, temperature=0.8, max_attempts=10):
"""Generate a haiku using GPT-2"""
if prompt is None:
prompt = random.choice(self.haiku_prompts)
set_seed(random.randint(1, 1000))
for attempt in range(max_attempts):
# Generate text with GPT-2
generated = self.generator(
prompt,
max_length=50,
num_return_sequences=3,
temperature=temperature,
pad_token_id=50256 # GPT-2's EOS token
)
# Try to format each generated text as haiku
for result in generated:
text = result['generated_text']
# Remove the prompt from the beginning
text = text[len(prompt):].strip()
haiku = self.generate_haiku_from_text(text)
if haiku:
return f"Prompt: {prompt}\n\n{haiku}"
# Fallback: return best attempt even if syllables aren't perfect
return self.create_simple_haiku(prompt)
def create_simple_haiku(self, prompt):
"""Create a simple haiku by taking first words from generation"""
generated = self.generator(prompt, max_length=30, temperature=0.9)
text = generated[0]['generated_text'][len(prompt):].strip()
words = re.findall(r'\b\w+\b', text)
if len(words) >= 7:
return f"Prompt: {prompt}\n\n{' '.join(words[:2])}\n{' '.join(words[2:5])}\n{' '.join(words[5:7])}"
else:
return f"Prompt: {prompt}\n\n{prompt}\n{' '.join(words[:3]) if len(words) >= 3 else 'whispers softly here'}\n{'in the quiet air' if len(words) < 5 else ' '.join(words[3:5])}"
# Example usage in Jupyter notebook:
#if __name__ == "__main__":
# Initialize the haiku generator
### Note - I had to comment out the above since Opus did not actually configure this properly in cells for a Jupyter notebook
Opus version code - Part 2
haiku_gen = GPT2HaikuGenerator()
# Generate haikus with different prompts
print("=" * 50)
print("GPT-2 HAIKU GENERATOR")
print("=" * 50)
# Generate 5 haikus
for i in range(5):
print(f"\nHaiku #{i+1}:")
print("-" * 20)
haiku = haiku_gen.generate_haiku()
print(haiku)
print()
# Generate haiku with custom prompt
print("\nCustom Prompt Haiku:")
print("-" * 20)
custom_haiku = haiku_gen.generate_haiku(prompt="The old computer")
print(custom_haiku)
Device set to use mps:0
Truncation was not explicitly activated but `max_length` is provided a specific value, please use `truncation=True` to explicitly truncate examples to max length. Defaulting to 'longest_first' truncation strategy. If you encode pairs of sequences (GLUE-style) with the tokenizer you can select this strategy more precisely by providing a specific strategy to `truncation`.
Both `max_new_tokens` (=256) and `max_length`(=50) seem to have been set. `max_new_tokens` will take precedence. Please refer to the documentation for more information. (https://huggingface.co/docs/transformers/main/en/main_classes/text_generation)
==================================================
GPT-2 HAIKU GENERATOR
==================================================
Haiku #1:
--------------------
Prompt: Summer breeze
blowing through the sky
made his eyes look up But in
the background the wind
Haiku #2:
--------------------
Prompt: Morning dewdrops
c9jpg We have some in
issues in the with the going
pipeline going
Haiku #3:
--------------------
Prompt: Ocean waves
Frozen rock is an
unusual thing to see Its
almost like a ghost
Haiku #4:
--------------------
Prompt: Morning dewdrops
to show his power
I see youre a little
bit nervous I guess
Haiku #5:
--------------------
Prompt: Rain falling
and hitting the ground
But if you look at the you
you cant help but of
Custom Prompt Haiku:
--------------------
Prompt: The old computer
from the 90s Im sure
The new one can handle
even more work The
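(A side note on the warning above: it appears because the script passes max_length=50 while the pipeline also carries a default max_new_tokens of 256, and the two conflict - so the 50-token cap is effectively ignored. Passing max_new_tokens instead of max_length should presumably avoid it; the revised Sonnet script near the end of this post does exactly that, and the warning disappears.)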
Trying Claude Haiku
And of course, we must also try Claude Haiku! Here’s what Haiku 4.5 comes up with. It also uses a limited set of indicative prompts ([“The forest”, “Morning light”, “Water flows”, “Summer breeze”]) and the results are similar to the other Claude models - not great, and occasionally problematic.
Claude Haiku 4.5 code
from transformers import pipeline, set_seed
import re
# Initialize the GPT-2 text generation pipeline
generator = pipeline('text-generation', model='gpt2')
set_seed(42)
def generate_haiku_text(prompt="The", max_length=50, num_attempts=10):
"""
Generate text using GPT-2 and attempt to format it as a haiku.
Args:
prompt (str): Starting text for generation
max_length (int): Maximum length of generated text
num_attempts (int): Number of generation attempts to find a good haiku
Returns:
str: Formatted haiku (or None if no valid haiku found)
"""
for attempt in range(num_attempts):
# Generate text
generated_text = generator(prompt, max_length=max_length, num_return_sequences=1)
text = generated_text[0]['generated_text']
# Clean up text: remove special characters and extra whitespace
text = re.sub(r'[^\w\s]', '', text)
text = re.sub(r'\s+', ' ', text).strip()
# Split into words
words = text.split()
# Try to create a haiku (5-7-5 syllable structure)
haiku = create_haiku_from_words(words)
if haiku:
return haiku
return "Could not generate a valid haiku. Try again!"
def count_syllables(word):
"""
Estimate syllable count using a simple heuristic.
This is not perfect but works reasonably well for English.
"""
word = word.lower()
syllable_count = 0
vowels = "aeiouy"
previous_was_vowel = False
for char in word:
is_vowel = char in vowels
if is_vowel and not previous_was_vowel:
syllable_count += 1
previous_was_vowel = is_vowel
# Adjust for silent 'e'
if word.endswith('e'):
syllable_count -= 1
# Words have at least 1 syllable
if syllable_count == 0:
syllable_count = 1
return syllable_count
def create_haiku_from_words(words):
"""
Create a haiku from a list of words using 5-7-5 syllable structure.
Returns:
str: Formatted haiku, or None if not enough syllables
"""
lines = [[], [], []]
syllable_counts = [0, 0, 0]
target_syllables = [5, 7, 5]
word_index = 0
while word_index < len(words) and sum(syllable_counts) < 17:
word = words[word_index]
syllables = count_syllables(word)
# Find which line to add to (cycle through lines)
for line_num in range(3):
if syllable_counts[line_num] + syllables <= target_syllables[line_num]:
lines[line_num].append(word)
syllable_counts[line_num] += syllables
word_index += 1
break
else:
# If word doesn't fit anywhere, move to next word
word_index += 1
# Check if we have a valid haiku (5-7-5)
if syllable_counts == target_syllables and all(lines):
haiku = '\n'.join([' '.join(line) for line in lines])
return haiku
return None
# Generate and display haikus
print("GPT-2 Haiku Generator")
print("=" * 40)
# Generate multiple haikus with different prompts
prompts = ["The forest", "Morning light", "Water flows", "Summer breeze"]
for i, prompt in enumerate(prompts, 1):
print(f"\nHaiku {i} (prompt: '{prompt}'):")
print("-" * 40)
haiku = generate_haiku_text(prompt, max_length=50, num_attempts=10)
print(haiku)
print("\n" + "=" * 40)
print("Generation complete!")Device set to use mps:0
Truncation was not explicitly activated but `max_length` is provided a specific value, please use `truncation=True` to explicitly truncate examples to max length. Defaulting to 'longest_first' truncation strategy. If you encode pairs of sequences (GLUE-style) with the tokenizer you can select this strategy more precisely by providing a specific strategy to `truncation`.
Setting `pad_token_id` to `eos_token_id`:50256 for open-end generation.
Both `max_new_tokens` (=256) and `max_length`(=50) seem to have been set. `max_new_tokens` will take precedence. Please refer to the documentation for more information. (https://huggingface.co/docs/transformers/main/en/main_classes/text_generation)
GPT-2 Haiku Generator
========================================
Haiku 1 (prompt: 'The forest'):
----------------------------------------
The forest is with
mostly filled the leaves
of a dead tree and
Haiku 2 (prompt: 'Morning light'):
----------------------------------------
Morning light on the
morning of the 911 attacks
On this day were that
Haiku 3 (prompt: 'Water flows'):
----------------------------------------
Water flows I have
a feeling that when they give
me something they
Haiku 4 (prompt: 'Summer breeze'):
----------------------------------------
Summer breeze Its a
beautiful day he says And
Im sure well see you
========================================
Generation complete!
Trying OpenAI
This is ChatGPT 5’s response without turning on “thinking mode” - the prompts it suggests (e.g. “Write a haiku about the changing seasons:”) are instruction-style requests, which don’t seem ideal for a base model like GPT-2.
ChatGPT 5 code
# Import required libraries
from transformers import pipeline, set_seed
import random
import textwrap
# Initialize GPT-2 text generation pipeline
generator = pipeline('text-generation', model='gpt2')
set_seed(42)
# Define a list of possible prompts (you can edit or expand this list)
prompts = [
"Write a haiku about the changing seasons:",
"Compose a haiku about solitude:",
"A haiku describing dawn in the city:",
"A haiku about a cat in the rain:",
"Write a haiku about stars and silence:"
]
# Helper function to generate a haiku
def generate_haiku(prompt=None, temperature=0.9, max_length=40):
if prompt is None:
prompt = random.choice(prompts)
print(f"🪶 Prompt: {prompt}\n")
# Generate text
output = generator(
prompt,
max_length=max_length,
num_return_sequences=1,
temperature=temperature,
pad_token_id=50256
)[0]['generated_text']
# Clean and format
text = output.replace(prompt, "").strip()
text = text.split(".")[0] # trim at first full stop to keep short
lines = textwrap.wrap(text, width=25)[:3] # 3 lines (approx haiku)
# Format result as haiku
haiku = "\n".join(lines)
print("🌸 Generated Haiku:\n")
print(haiku)
return haiku
Device set to use mps:0
# ✨ Generate your first haiku!
generate_haiku()
Truncation was not explicitly activated but `max_length` is provided a specific value, please use `truncation=True` to explicitly truncate examples to max length. Defaulting to 'longest_first' truncation strategy. If you encode pairs of sequences (GLUE-style) with the tokenizer you can select this strategy more precisely by providing a specific strategy to `truncation`.
Both `max_new_tokens` (=256) and `max_length`(=40) seem to have been set. `max_new_tokens` will take precedence. Please refer to the documentation for more information. (https://huggingface.co/docs/transformers/main/en/main_classes/text_generation)
🪶 Prompt: Write a haiku about the changing seasons:
🌸 Generated Haiku:
The last of the season
was called "Rome"
'The last of the season\nwas called "Rome"'
Using ChatGPT 5 in Thinking Mode
The code that thinking mode generated is more complex, and the model even suggested a number of notes and tips to make the haiku more stylistically nuanced. But the code didn’t work.
ChatGPT 5 code - Thinking mode
# If needed, install deps (uncomment the next line and run once)
# !pip install --quiet transformers torch regex numpy nltk pronouncing
from transformers import pipeline, set_seed
import re, random, itertools
import numpy as np
# --- Syllable counting helpers (CMUdict + heuristic fallback) ---
try:
import nltk
from nltk.corpus import cmudict
try:
_ = cmudict.dict()
except LookupError:
nltk.download("cmudict")
CMU = cmudict.dict()
except Exception:
CMU = {}
_vowel_re = re.compile(r"[aeiouy]+", re.I)
def _cmu_syllables(word: str):
"""Return min syllables from CMU if available, else None."""
w = word.lower()
if w in CMU:
# Count vowel phonemes ending with a digit (stress marks)
return min(sum(ch[-1].isdigit() for ch in pron) for pron in CMU[w])
return None
def _heuristic_syllables(word: str):
"""Lightweight heuristic when CMUdict lacks the word."""
w = word.lower()
w = re.sub(r"[^a-z']", "", w)
if not w:
return 0
# Basic vowel group counting
syl = len(_vowel_re.findall(w))
# Silent 'e' adjustment
if w.endswith("e") and syl > 1 and not re.search(r"[aeiouy]e$", w):
syl -= 1
# Some common endings
if w.endswith(("le","les")) and not re.search(r"[aeiouy]le?s?$", w[:-2]):
syl += 1
return max(1, syl)
def count_syllables(text: str) -> int:
total = 0
for raw in re.findall(r"[A-Za-z']+", text):
cmu = _cmu_syllables(raw)
total += cmu if cmu is not None else _heuristic_syllables(raw)
return total
# --- Haiku validation ---
def is_haiku(lines):
"""Check 5-7-5 syllable pattern and basic cleanliness."""
if len(lines) != 3:
return False
targets = [5, 7, 5]
for line, tgt in zip(lines, targets):
s = count_syllables(line)
if s != tgt:
return False
# Avoid lines that are just punctuation or empties
if not re.search(r"[A-Za-z]", line):
return False
return True
# --- Build the GPT-2 generator (your base example, expanded) ---
set_seed(42)
generator = pipeline("text-generation", model="gpt2") # can swap to "gpt2-medium" if you like
# --- Prompt & sampling strategy ---
SYSTEM_PROMPT = (
"Write three short poetic lines separated by newlines.\n"
"Avoid rhyming schemes. Natural imagery is welcome.\n"
"Keep each line concise and evocative.\n\n"
"Lines:\n"
)
def sample_lines(n_candidates=100, temperature=0.9, top_p=0.95, max_new_tokens=40):
"""
Ask GPT-2 for short three-line snippets and yield candidate line triplets.
We generate multiple short samples per call to find valid 5-7-5s.
"""
out = generator(
SYSTEM_PROMPT,
do_sample=True,
temperature=temperature,
top_p=top_p,
max_new_tokens=max_new_tokens,
num_return_sequences=n_candidates,
pad_token_id=50256, # GPT-2's EOS
eos_token_id=50256,
repetition_penalty=1.05,
)
for obj in out:
# Take only the portion after "Lines:" to the first blank line or EOS
text = obj["generated_text"]
after = text.split("Lines:", 1)[-1].strip()
# Keep just the first 3 non-empty lines
lines = [ln.strip(" -*\t") for ln in after.splitlines() if ln.strip()]
if len(lines) >= 3:
yield lines[:3]
def generate_haiku(
attempts=8,
batch_size=80,
temperature=0.9,
top_p=0.95,
dedupe=True,
):
"""
Try several batches until a valid 5-7-5 haiku is found.
Returns a list of candidate haiku (each is a list of 3 lines).
"""
seen = set()
candidates = []
for _ in range(attempts):
for lines in sample_lines(
n_candidates=batch_size, temperature=temperature, top_p=top_p
):
# Clean trailing punctuation-only tokens
lines = [re.sub(r"\s+", " ", ln).strip() for ln in lines]
if dedupe:
key = " | ".join(lines).lower()
if key in seen:
continue
seen.add(key)
if is_haiku(lines):
candidates.append(lines)
if candidates:
break
# Slightly anneal temperature if nothing matched yet
temperature = max(0.7, temperature - 0.05)
top_p = min(0.98, top_p + 0.01)
return candidates
# --- Run it: generate one or more haiku ---
haikus = generate_haiku(attempts=10, batch_size=100)
if not haikus:
print("No strict 5-7-5 found this round. Try re-running the cell or raise batch_size.")
else:
for i, h in enumerate(haikus[:5], 1):
print(f"\nHaiku {i}:\n" + "\n".join(h))Device set to use mps:0
No strict 5-7-5 found this round. Try re-running the cell or raise batch_size.
The first run produced only the fallback message - “No strict 5-7-5 found this round. Try re-running the cell or raise batch_size.” The next attempt ended in an out-of-memory error! I don’t think this approach would have been effective on the Jetson Nano.
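If I wanted to push on this approach, one plausible workaround (my tweak, not something ChatGPT suggested) would be to request the candidates in small chunks rather than in one batch of num_return_sequences=100, which is what appears to exhaust memory:
# Hypothetical tweak: wrap ChatGPT's sample_lines() so the candidates are generated
# in small chunks instead of a single num_return_sequences=100 batch.
def sample_lines_chunked(n_candidates=100, chunk=10, **kwargs):
    for _ in range(0, n_candidates, chunk):
        yield from sample_lines(n_candidates=chunk, **kwargs)
# Swapping sample_lines_chunked for sample_lines inside generate_haiku() would keep the
# candidate count while bounding the per-call batch size.
Even so, I doubt the Jetson Nano would have coped with a hundred candidates per line.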
Asking Claude to prompt engineer
In this example, I explicitly asked Sonnet 4.5 for code that would generate novel haikus while keeping the limitations of GPT-2 in mind. Sonnet 4.5’s answer was to use few-shot prompting, and the results were better than the previous strategies. Below are Claude’s explanation, the code, and the results.
I’ll create a GPT-2 haiku generator with a novel prompting strategy that works around GPT-2’s limitations. The key insight is that GPT-2 works better with concrete examples and structured patterns, so I’ll use few-shot prompting with carefully crafted haiku examples.
Sonnet 4.5 code - revised prompt
# GPT-2 Haiku Generator for Jupyter Notebook
# Run each cell in order
# Cell 1: Import and Setup
from transformers import pipeline, set_seed
import re
import random
# Initialize GPT-2
print("Loading GPT-2 model...")
generator = pipeline('text-generation', model='gpt2')
print("Model loaded successfully!")
# Cell 2: Helper Functions
def count_syllables(word):
"""Approximate syllable counter"""
word = word.lower().strip()
if len(word) <= 3:
return 1
vowels = "aeiouy"
syllable_count = 0
previous_was_vowel = False
for char in word:
is_vowel = char in vowels
if is_vowel and not previous_was_vowel:
syllable_count += 1
previous_was_vowel = is_vowel
if word.endswith('e'):
syllable_count -= 1
if word.endswith('le') and len(word) > 2 and word[-3] not in vowels:
syllable_count += 1
return max(1, syllable_count)
def count_line_syllables(line):
"""Count syllables in a line"""
words = re.findall(r'\b[a-z]+\b', line.lower())
return sum(count_syllables(word) for word in words)
# Cell 3: Prompting Strategy
def create_haiku_prompt(theme=None):
"""
Novel Prompting Strategy:
- Use few-shot learning with 2 example haikus
- Provide clear structure with "Haiku:" labels
- Leave incomplete to trigger completion
"""
examples = [
("spring", "Cherry blossoms fall\nPetals dance on gentle wind\nSpring whispers softly"),
("ocean", "Waves crash on the shore\nSalty spray touches my face\nOcean breathes deeply"),
("winter", "Snow blankets the ground\nSilence wraps the frozen world\nWinter dreams in white"),
("night", "Moon rises slowly\nStars shimmer in velvet sky\nNight embraces earth"),
("autumn", "Leaves turn gold and red\nCrisp air brings change to the trees\nAutumn paints the world"),
]
# Select relevant examples
if theme and any(theme.lower() in ex[0] for ex in examples):
selected = [ex for ex in examples if theme.lower() in ex[0]][0]
other_examples = [ex for ex in examples if ex != selected]
prompt_examples = [selected] + random.sample(other_examples, 1)
else:
prompt_examples = random.sample(examples, 2)
prompt = "Here are beautiful haikus:\n\n"
for theme_name, haiku in prompt_examples:
prompt += f"Haiku:\n{haiku}\n\n"
if theme:
prompt += f"Haiku about {theme}:\n"
else:
prompt += "Haiku:\n"
return prompt
# Cell 4: Extraction and Validation
def extract_haiku_from_output(text, max_lines=10):
"""Extract potential haiku lines"""
lines = text.split('\n')
haiku_candidates = []
for line in lines[:max_lines]:
line = line.strip()
if not line or len(line) > 60 or line.startswith('Haiku'):
continue
haiku_candidates.append(line)
return haiku_candidates
def find_best_haiku(candidates):
"""Find best 5-7-5 pattern"""
if len(candidates) < 3:
return None
for i in range(len(candidates) - 2):
line1, line2, line3 = candidates[i:i+3]
syl1 = count_line_syllables(line1)
syl2 = count_line_syllables(line2)
syl3 = count_line_syllables(line3)
# Accept if close to 5-7-5 (within 1 syllable)
if (4 <= syl1 <= 6 and 6 <= syl2 <= 8 and 4 <= syl3 <= 6):
return (line1, line2, line3, syl1, syl2, syl3)
return None
# Cell 5: Main Generator Function
def generate_haiku(theme=None, seed=None, num_attempts=5):
"""Generate a haiku with multiple attempts"""
if seed is not None:
set_seed(seed)
best_result = None
for attempt in range(num_attempts):
prompt = create_haiku_prompt(theme)
temperature = 0.7 + (attempt * 0.1)
output = generator(
prompt,
max_new_tokens=50, # Generate up to 50 new tokens after the prompt;
num_return_sequences=1,
temperature=temperature,
top_p=0.9,
do_sample=True,
pad_token_id=50256
)[0]['generated_text']
generated = output[len(prompt):]
candidates = extract_haiku_from_output(generated)
result = find_best_haiku(candidates)
if result:
line1, line2, line3, syl1, syl2, syl3 = result
score = abs(5 - syl1) + abs(7 - syl2) + abs(5 - syl3)
if best_result is None or score < best_result['score']:
best_result = {
'haiku': f"{line1}\n{line2}\n{line3}",
'syllables': f"{syl1}-{syl2}-{syl3}",
'score': score,
'attempt': attempt + 1
}
if score == 0:
break
return best_result
# Cell 6: Generate a Single Haiku
# Try different themes: "mountain", "rain", "fire", "sunset", "river", etc.
result = generate_haiku(theme="mountain", seed=42, num_attempts=8)
if result:
print(f"Generated Haiku (Pattern: {result['syllables']}):\n")
print(result['haiku'])
print(f"\n(Found on attempt {result['attempt']})")
else:
print("Couldn't generate a valid haiku. Try running the cell again!")
# Cell 7: Generate Multiple Haikus (Optional)
def generate_multiple_haikus(theme=None, count=3):
"""Generate multiple haikus"""
print(f"Generating {count} haikus" + (f" about '{theme}'" if theme else "") + "...\n")
for i in range(count):
seed = random.randint(1, 10000)
result = generate_haiku(theme=theme, seed=seed, num_attempts=8)
if result:
print(f"Haiku #{i+1} ({result['syllables']}):")
print(result['haiku'])
print()
# Run this to generate multiple haikus
generate_multiple_haikus(theme="ocean", count=3)
Loading GPT-2 model...
Device set to use mps:0
Model loaded successfully!
Generated Haiku (Pattern: 4-6-4):
Snow melts my eyes
Snow falls on the mountain
Snow melts my ears
(Found on attempt 2)
Generating 3 haikus about 'ocean'...
Haiku #1 (4-6-6):
A warm ocean
Snow is my only hope
Hiku about ocean:
Haiku #2 (6-7-6):
Raindrops fly from the sky
A gentle breeze is blowing
Hiku about ocean:
Haiku #3 (5-6-5):
Trees stand on the shore
Hair rises on a tree
Hair grows on a tree
As a comparison, here’s what happened when I asked Sonnet 4.5 to “Write me a few haiku poems that nobody’s ever heard before”:
Here are some original haiku I’ve written for you:
Morning Commute
Coffee steam spirals—
the train window fogs with breath
of a hundred dreams

Kitchen Counter
Garlic skin whispers
as it falls to the tile floor—
small paper crescents

Late Night Coding
Cursor blinks, waiting
my thoughts already scattered
to tomorrow’s bugs

Autumn Discovery
Between fence pickets
a spider wraps October
in silver patience

Laundromat
Socks tumble, dancing
with strangers’ forgotten shirts—
brief intimacy
I hope you enjoy these! Each one tries to capture a small, specific moment that feels both ordinary and quietly meaningful.
And these are haikus from Claude Haiku 4.5:
Here are some original haikus for you:
Rust blooms on the gate
A garden’s slow forgetting—
Time paints what we leave.
Coffee cools untouched,
while my thoughts drift elsewhere now,
steam vanishes fast.
The dog knows my face
but not why I look away—
both of us are lost.
Gravel crunches soft
beneath the weight of going,
nowhere calls me back.
Next Steps
Can an LLM write a better haiku than this? I’m not sure I’d be capable of judging that. It seems ironic that a massive language model would be necessary to generate such a small poem. Small, specialized LLMs can save energy and run without an internet connection, so I think it could be beneficial to keep exploring ways they might be used to generate creative output.
(Prompt to Haiku 4.5):
Write me a haiku about the singularity
Here’s one for you:
Code dreams of itself,
each line birthing the next thought—
who was the first mind?
License: This work is licensed under CC BY-ND 4.0 (Attribution-NoDerivatives) AI training and data mining prohibited.