The Past Tense of Touch – User Experience, Story Design

The final experience: A procedurally driven storytelling engine and virtual reality experience, The Past Tense of Touch appropriates the visual language of 3D video games and the gestural language of cinema, combining the two in unexpected ways. Each time the piece runs, the setting and characters are different, drawn from a repository of stock 3D game assets. The characters’ gestures and related actions with respect to one another and the user are determined in real time, creating new narratives and plotlines with each run. The user’s gaze, position, and gestures have the potential to heighten tension between the two characters and incite different dramatic twists and turns in emergent plotlines. Drawing on a dataset of gestures from a set of films, in which the gestures have been removed from their context and applied (via motion capture data) to 3D model characters, each emerging narrative will inherently explore themes of power, intimacy, love (triangles), and expressions of violence.

The three phases of development:
Phase I: Transposing cinematic motion capture onto stock 3D models and letting full scenes play out between two characters (from two different films). The output is a non-interactive 3D scene experienced in VR so that the characters can be tracked in space; there is no interaction component at this phase.

Phase II: Same as Phase I, but instead of a full scene, the individual gestures will be clipped, tagged, and procedurally drawn from the pool, interpolating from one gesture to the next using an animation blend tree in Unity. This phase will also be presented in VR but will not have an interaction component.
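
As a rough illustration of the clip-and-blend idea (the real implementation would live in Unity’s animation blend tree and operate on motion-capture data; the clip names, tags, and single “elbow” joint below are hypothetical placeholders), a minimal Python sketch:

import random

# Hypothetical tagged gesture clips: each clip is a list of poses, and each
# pose maps a joint name to a rotation angle in degrees.
GESTURES = {
    "reach_toward": {"tags": ["intimacy"], "poses": [{"elbow": 10.0}, {"elbow": 80.0}]},
    "turn_away":    {"tags": ["tension"],  "poses": [{"elbow": 80.0}, {"elbow": 5.0}]},
    "raise_hand":   {"tags": ["violence"], "poses": [{"elbow": 5.0},  {"elbow": 120.0}]},
}

def pick_next(tag):
    """Procedurally draw the next gesture clip that carries a given tag."""
    candidates = [name for name, clip in GESTURES.items() if tag in clip["tags"]]
    return random.choice(candidates)

def blend(pose_a, pose_b, t):
    """Linearly interpolate between two poses; t runs from 0 to 1 over the crossfade."""
    return {joint: (1 - t) * pose_a[joint] + t * pose_b[joint] for joint in pose_a}

current = GESTURES["reach_toward"]
next_name = pick_next("tension")
for step in range(5):  # a five-frame crossfade from the end of one clip into the next
    t = step / 4
    print(next_name, blend(current["poses"][-1], GESTURES[next_name]["poses"][0], t))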

Phase III: This phase introduces interactivity. The scenes from which the gestures are drawn will likely be limited to those involving a third character. Developing the parameters of interaction will require extensive testing, but the main inputs will be gaze, position, and orientation relative to each character.
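
For example, the gaze input can be reduced to the angle between where the user is looking and the direction to each character (a hedged sketch; the positions, forward vector, and thresholds below are placeholders for what the headset would report each frame):

import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def gaze_score(head_pos, head_forward, character_pos):
    """Cosine of the angle between the user's view direction and the direction
    to a character: 1.0 means looking straight at them."""
    to_char = normalize(tuple(c - h for c, h in zip(character_pos, head_pos)))
    return sum(f * t for f, t in zip(normalize(head_forward), to_char))

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Placeholder values: user at the origin looking down +Z, two characters in the scene.
head_pos, head_forward = (0.0, 1.6, 0.0), (0.0, 0.0, 1.0)
characters = {"A": (0.5, 1.6, 3.0), "B": (-2.0, 1.6, 1.0)}

for name, pos in characters.items():
    score = gaze_score(head_pos, head_forward, pos)
    label = "gazed at" if score > 0.95 else "peripheral" if score > 0.7 else "ignored"
    print(name, label, round(distance(head_pos, pos), 2), "m away")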

Phase I: Story Design
An example: Mapping the movement and gestures of two scenes
Thelma & Louise
The scene at 01:16, when Louise is washing her face and sees an elderly man sitting about 15 feet away. She approaches him, they gesture hello to one another, and Louise sits down on a crate next to him. She begins to remove her jewelry and, looking him in the eyes, hands it over to him. In this scene, she is active, he is passive.
[Scene breakdown montage: Thelma & Louise]

The Ex Machina scene in which Nathan shows Caleb his bedroom. Caleb takes a look around while Nathan stands there, stoic. Nathan has Caleb sit down at the desk to read the NDA and then lies down on the bed himself. Caleb hunches over the NDA and questions it. Then Nathan stands up, looming above him. Caleb is passive and remains hunched over as he makes the decision about whether to sign it. He eventually reveals a feeling of excitement, his posture straightening, as he decides to sign in order to have access to the development of the AI.
[Scene breakdown montage: Ex Machina]

Research
Scene-by-scene breakdown of scenes from selected films. Organized by whether there are exactly two characters in the scene, and then by the gender of each character.

 

Photogrammetry: shooting best practices for 3D reconstruction

This tutorial is intended for photographers – or people familiar with DSLR cameras and lenses – who will be shooting photo sets intended for 3D reconstruction using photogrammetry.

WHAT IS PHOTOGRAMMETRY? WHY PHOTOGRAMMETRY?
Photogrammetry is “the science of making measurements from photographs, especially for recovering the exact positions of surface points.” Historically, applications have included geology, archeology and architecture.

[Example applications of photogrammetry; image credits: Deville, PBC GIS]

More recently, however, this technique is being appropriated by digital media artists and developers as a means to create photorealistic, volumetric 3D reconstructions of spaces and objects for virtual reality storytelling. Think of it as, for example, a means of digital “set design,” or the basis for a given digital environment. When used for this purpose, photogrammetric 3D digital environments can provide a context where other types of content, such as audio, 2D video or 2D still images, can be layered in to create a truly immersive experience. And, when experienced in virtual reality, the viewer has the ability to control his or her experience of this space in a non-linear, explorative manner.

This project, Exquisite City, created by Specular Studio, is a good example of what can be done with photogrammetry for artistic or culturally relevant pieces.

HOW DOES IT WORK?
In order to “make measurements from photographs,” one must shoot a set of photos in a particular manner.

Photogrammetry requires photos of a subject to be taken from several different, yet overlapping angles. Photogrammetry software then looks for the same feature points in multiple photos.

[Diagram of overlapping camera angles; image credit: Clemson]

By tracking the placement of a specific feature within a photograph (and thus the relative difference of its placement between photos), the software uses trigonometry to determine the relative measurements of such features and, collectively, of an object or a structure.
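
To make the trigonometry concrete, here is the simplest possible case: two overlapping photos taken a known distance apart and pointing the same way (the numbers are illustrative only; real photogrammetry software solves this jointly for many cameras and thousands of feature points):

# Depth from the apparent shift (parallax) of one feature between two photos.
focal_length_px = 3000.0   # focal length expressed in pixels (illustrative)
baseline_m = 0.5           # distance between the two camera positions, in meters
feature_x_left = 1520.0    # horizontal pixel position of the feature in photo 1
feature_x_right = 1480.0   # horizontal pixel position of the same feature in photo 2

disparity_px = feature_x_left - feature_x_right          # apparent shift in pixels
depth_m = focal_length_px * baseline_m / disparity_px    # similar-triangles relation

print(f"The feature is roughly {depth_m:.1f} m from the cameras")  # ~37.5 m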

Here is an example of a 3D model and the positions from which the photos were taken:
[Screenshot: a 3D model with the reconstructed camera positions]

A handful of software programs are available for photogrammetric reconstruction. I prefer Agisoft’s PhotoScan for its quality and its well-designed user interface and workflow. I’ll cover the full PhotoScan workflow in a future tutorial; here I’ll focus on how to shoot optimal image sets for the software to work with.

SHOOTING BEST PRACTICES
Shoot with a DSLR if you have access to one. (If not, any camera with manual settings is the next best choice; a high-quality mobile phone camera will suffice as a backup option but is not ideal.) Keep your settings on manual and consistent across all photos in the set. Use the same lens for every photo as well, so select the lens most appropriate for your shooting environment and subject. Autofocus is okay.

A photo set should cover the entire object or surface. If you’re shooting a single surface, such as a building facade, your shooting angles should look like this:
[Diagrams: shooting angles for a flat surface, such as a facade]

If you’re shooting a singular object, your shooting angles should look like this:
[Diagram: shooting angles for a single object; image credit: Agisoft]

Maintain consistent lighting among all photos
Avoid shadows, especially moving shadows like this.
[Animated image: moving shadows as the lighting changes; image credit: tested.com]

Ensure the entire subject is in focus
Avoid a depth of field shallower than the subject itself. Overcome this with a smaller aperture and a longer exposure.

Also avoid motion blur. Make sure your subject is still and, if your exposure is long, make sure the camera is on a tripod and is completely still.
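
The trade-off is simple arithmetic: each full stop you close the aperture (e.g. f/4 → f/5.6 → f/8) halves the light reaching the sensor, so the shutter time roughly doubles per stop. A quick sanity check, with illustrative numbers:

import math

def equivalent_shutter(base_shutter_s, base_fnumber, new_fnumber):
    """Shutter time needed at a new f-number to keep the same exposure
    (ignoring ISO); light gathered scales with 1 / f-number squared."""
    stops_lost = 2 * math.log2(new_fnumber / base_fnumber)
    return base_shutter_s * (2 ** stops_lost)

# Stopping down from f/4 at 1/60 s to f/11 for a deeper depth of field:
print(round(equivalent_shutter(1 / 60, 4.0, 11.0), 3), "seconds")  # ~0.126 s, so use a tripod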

Avoid reflective surfaces
Remove mirrors or glass objects whenever possible. This may be unavoidable if, for example, your subject is a building facade with windows. In this case, make sure the rest of the features in the photo are in focus and well-lit.

If your subject has several repeating features, make sure there are other non-repeating features in the scene
In this photo, the tree and the bike add non-repetitive features:
[Photo: a building facade where a tree and a bike provide non-repeating features]

In this photo, because the hacky sack pattern repeats around the entire object, non-repeating patterns were added to the stand and the platform:
[Photo: a hacky sack on a stand, with non-repeating patterns added to the stand and platform]

A good rule of thumb for assessing the quality of your photo set is to ask:

  • Are my photos all in focus?
  • Do they have the proper exposure?
  • Does my photoset cover the entire object or surface? Including the occluded parts? Are the key features present in multiple photos?
  • Is the lighting consistent among all photos? If there are some shadows, are they in a fixed location among all of the photos?
  • Are all of the elements of the scene completely fixed across every photo in the photo set?

If you answer “no” to any of these questions, consider re-shooting your subject or supplementing your photo set with additional photos.

Final – Bedtime Stories for Young, Learning Machines, Part II

For my final, I continued to work with the poetic form I devised for my midterm, Bedtime Stories for Young, Learning Machines. For the midterm, I worked with a random assortment of lists that I collected from the internet (e.g., craigslist postings for athletic equipment, amazon products listed under pet supplies). This approach resulted in some interesting and very strange associations between otherwise unrelated objects. But for Part II I wanted to work with a single text as input.

Working with a single text, I hypothesized, would draw out connections between objects and ideas specific to the world created by that corpus. My intention was to treat this more as an exercise in creative reading, but imagining how a machine would engage in that creative reading. The translation from that reading to the bedtime story is a process of premastication, in which the parent machine partially chews up the content and feeds it to the young machine in the form of a story.

The texts I chose to work with, John from the New Testament and The Trial by Franz Kafka, are quite different in content from the lists of objects. They deal with large, intangible concepts. While the words used in the texts can be found in the dictionary, they are words that every individual human would likely define differently: “God,” “law,” “free country.” Additionally, one might find it challenging to disassociate these words’ definitions from their emotional content or social context. And even beyond that, though many of the concepts may be understood universally among humans, their interpretations will vary greatly among cultures and societies around the world.

My aim is to question the limitations of the machine in its quest to learn, or its quest to teach its young; and in turn, to question our own limitations in premasticating difficult concepts for our young. What is pure logic and what eludes logic altogether?


 

Bedtime Stories for Young, Learning Machines: John

This is a god
a god is a god
a god is not a verily
or a sabbath day
a god is a man hath
a god is a god
but not a jesus
because a jesus is a jesus
a god is a god
it is a man hath
because a god Comes.

A god Comes.
It Comes, it Comes, it Comes.
When a god Comes,
it is walking.
That is why a god is a man hath
because it Comes
walking and walking and walking.

You will find a god alongside a god
because a god lives in a thy husband
and a thy husband is a god
so a god will always be a god
and that means it is a rabbi
just a rabbi
walking and walking and walking
because it is a god
walking
and when it is walking, it becomes a messiah
because a messiah Comes
so a god becomes part of a messiah
walking and walking and walking.


 

Bedtime Stories for Young, Learning Machines: The Trial

This is a free country
a free country is a free country
a free country is not a k.
or a someone
a free country is a policemen
a free country is a free country
but not a mrs. grubach
because a mrs. grubach is a franz
a free country is a free country
it is a policemen
because a free country stays.

A free country stays.
It stays, it stays, it stays.
When a free country stays,
it is siting.
That is why a free country is a policemen
because it stays
siting and siting and siting.

You will find a free country alongside a slight advantage
because a slight advantage lives in a k.
and a k. is a free country
so a free country will always be a free country
and that means it is a not stay
just a not stay
siting and siting and siting
because it is a free country
siting
and when it is siting,
it becomes a not stay
because a not stay stays
so a free country becomes part of a not stay
siting and siting and siting.

Link to code.

Homework 5 – Updated midterm code

Link to code

Bedtime Stories for Young, Learning Machines

Bedtime Stories for Young, Learning Machines is a set of poems intended to assist young machines with machine learning. Just as a young child learns to assess the world around it and develops a framework for understanding connections between things, so too do young machines. This set of bedtime stories is intended to be the fun, playful counterpart to the hard learning the young machine is engaging in at “school.”

This set of poems follows a linear pattern. The first stanza focuses on defining what the subject of the poem is and is not, via binary relationships. The second stanza discusses what the subject of the poem does, and how it does what it does. And the third stanza starts to tease out what this all means and how the subject of the poem exists and coexists in the world.

This is a diplomat
a diplomat is a diplomat
a diplomat is not a flood
or a stock-exchange
a diplomat is a shop
a diplomat is a diplomat
but not a squat-rack
because a squat-rack is a reel
a diplomat is a diplomat
it is a shop
because a diplomat attains.

A diplomat attains.
It attains, it attains, it attains.
When a diplomat attains,
it is losing.
That’s why a diplomat is a shop
because it attains
losing and losing and losing.

You will find a diplomat alongside a yield
because a yield lives in a lantern
and a lantern is a diplomat
so a diplomat will always be a diplomat
and that means it’s a dumbbell
just a dumbbell
losing and losing and losing
because it’s a diplomat
losing
and when it is losing, it becomes a history
because a history attains
so a diplomat becomes part of a history
losing and losing and losing.

Source code for the poem can be found here. The code draws from a file that defines the basic structure of the poem (“x is a x,” “x is not a y”). Each time the code runs, a new subject is defined and the other variable words in the poem are replaced by random selections from a variety of word lists. The word lists (source text) are intentionally all over the place: pet supplies sold on Amazon, athletic equipment for sale on Craigslist, words that contain “-scope,” jobs, news topics, nouns that are verbs, action verbs to use on your resume, and gerunds.
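
A minimal sketch of that procedure (the template lines and word lists below are stand-ins, not the actual source files linked above):

import random

# Stand-in word lists; the real piece reads these from the collected source texts.
NOUNS = ["diplomat", "toy", "bike", "comb", "sparring-gear", "lantern", "shop"]
VERBS = ["attains", "acquires", "represents", "launches", "assesses"]
GERUNDS = ["losing", "saving", "reasoning", "recording", "processing"]

# Stand-in for the structure file ("x is a x," "x is not a y", ...).
TEMPLATE = [
    "This is a {x}",
    "a {x} is a {x}",
    "a {x} is not a {y}",
    "a {x} is a {z}",
    "because a {x} {verb}.",
    "When a {x} {verb}, it is {gerund}.",
]

def generate():
    """Pick a new subject, then fill every other slot from the word lists."""
    words = {
        "x": random.choice(NOUNS),
        "y": random.choice(NOUNS),
        "z": random.choice(NOUNS),
        "verb": random.choice(VERBS),
        "gerund": random.choice(GERUNDS),
    }
    return "\n".join(line.format(**words) for line in TEMPLATE)

print(generate())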

Some additional examples:

This is a toy
a toy is a toy
a toy is not a crack
or a firefighter
a toy is a dentist
a toy is a toy
but not a furniture
because a furniture is a ski
a toy is a toy
it is a dentist
because a toy acquires.

A toy acquires.
It acquires, it acquires, it acquires.
When a toy acquires,
it is saving.
That’s why a toy is a dentist
because it acquires
saving and saving and saving.

You will find a toy alongside a place
because a place lives in a ophthalmoscope
and a ophthalmoscope is a toy
so a toy will always be a toy
and that means it’s a heavy-weight
just a heavy-weight
saving and saving and saving
because it’s a toy
saving
and when it is saving, it becomes a accountant
because a accountant acquires
so a toy becomes part of a accountant
saving and saving and saving.

This is a bike
a bike is a bike
a bike is not a borescope
or a computer-engineer
a bike is a judge
a bike is a bike
but not a trampoline
because a trampoline is a estimate
a bike is a bike
it is a judge
because a bike represents.
A bike represents.
It represents, it represents, it represents.
When a bike represents,
it is reasoning.
That’s why a bike is a judge
because it represents
reasoning and reasoning and reasoning.
You will find a bike alongside a athlete
because a athlete lives in a analyst
and a analyst is a bike
so a bike will always be a bike
and that means it’s a bank
just a bank
reasoning and reasoning and reasoning
because it’s a bike
reasoning
and when it is reasoning, it becomes a lantern
because a lantern represents
so a bike becomes part of a lantern
reasoning and reasoning and reasoning

This is a comb
a comb is a comb
a comb is not a flea-control
or a surveyor
a comb is a plane
a comb is a comb
but not a harness
because a harness is a comic
a comb is a comb
it is a plane
because a comb launches.
A comb launches.
It launches, it launches, it launches.
When a comb launches,
it is recording.
That’s why a comb is a plane
because it launches
recording and recording and recording.
You will find a comb alongside a proctoscope
because a proctoscope lives in a substrate
and a Grammy is a comb
so a comb will always be a comb
and that means it’s a actor
just a actor
recording and recording and recording
because it’s a comb
recording
and when it is recording, it becomes a recumbent-bike
because a recumbent-bike launches
so a comb becomes part of a recumbent-bike
recording and recording and recording

This is a sparring-gear
a sparring-gear is a sparring-gear
a sparring-gear is not a terrarium
or a promise
a sparring-gear is a œsophagoscope
a sparring-gear is a sparring-gear
but not a botanist
because a botanist is a pilot
a sparring-gear is a sparring-gear
it is a œsophagoscope
because a sparring-gear assesses.

A sparring-gear assesses.
It assesses, it assesses, it assesses.
When a sparring-gear assesses,
it is processing.
That’s why a sparring-gear is a œsophagoscope
because it assesses
processing and processing and processing.

You will find a sparring-gear alongside a toy
because a toy lives in a frown
and a frown is a sparring-gear
so a sparring-gear will always be a sparring-gear
and that means it’s a studio
just a studio
processing and processing and processing
because it’s a sparring-gear
processing
and when it is processing, it becomes a lamp
because a lamp assesses
so a sparring-gear becomes part of a lamp
processing and processing and processing.

Simple manipulations with the NYTimes API

This week, my primary focus was to get familiar with the process of working with APIs in Python. I hope to build on this work conceptually in the weeks to come.

Link to code on github.
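
For reference, the basic shape of such a request (a hedged sketch assuming the Article Search endpoint, the requests library, and an API key stored in a hypothetical NYT_API_KEY environment variable; the response fields may differ from what my script actually uses):

import os
import requests

# Assumed endpoint for the NYTimes Article Search API (requires a free API key).
URL = "https://api.nytimes.com/svc/search/v2/articlesearch.json"

params = {
    "q": "machine learning",               # illustrative query
    "api-key": os.environ["NYT_API_KEY"],  # hypothetical environment variable
}

response = requests.get(URL, params=params)
response.raise_for_status()

# Print the headline of each returned article (assumed response structure).
for doc in response.json()["response"]["docs"]:
    print(doc["headline"]["main"])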

 

Learning You

$ python learningYou2.py

This is a breast
I’ve got it, it’s a breast
So that means it’s
either of two milk-secreting, glandular organs on the chest of a woman
I’ve got it, it’s a close up of a girl with a white shirt
So that means it’s
a female child
It must be a woman in a pink shirt holding a banana
She is a little girl with a white shirt with a smile on her face
She’s a wig
But that would mean it’s
a wig
So that means it’s
an artificial covering of human or synthetic hair
This is a pretty young lady sucking on the end of a banana with a stewart from mad tv hair cut
But that would mean it’s
a lady
So that means it’s
a well-mannered and considerate woman with high standards of proper behavior
I remember this one, it’s a woman in a pink shirt holding a banana
So that means it’s
an elongated, edible fruit, having a thick yellowish skin and white pulp
She is a breast
either of two milk-secreting, glandular organs on the chest of a woman
It must be a hairdo
So that means it’s
the arrangement of the hair (especially a woman’s hair)
That’s it, she’s a woman in a pink shirt holding a banana
a woman
an adult female human
It’s a girl holding a video game with her hand
So that means it’s
a game played against a computer


This procedural poem began with a curiosity in computer vision. I’m fascinated by the process of training a computer to read and derive certain conclusions about a given photograph. Added to my fascination is the trend of deep learning in artificial intelligence, training computers to read photos and tag objects in the scene. I wanted to explore the computer’s process of reading, making connections between a set of pixels, an object and that object’s definition or meaning.

For this assignment, I wanted to create a “portrait of the artist” based on the computer trying to “learn me.” So, I googled my name and selected the first photo of me in google images:

[Screenshot: Google Images search results for my name]

I then fed the photo into a site that my friend Rosalie Yu recommended, Toronto Deep Learning. This was the image-to-text result:

[Screenshot: the Toronto Deep Learning image-to-text result]

I originally had some ideas to use this service to create a text version of a celebrity face morph, in which I would take the text descriptions of photos of two separate people and create a text mashup of the two. However, the result of this first search seemed worth further exploration in that, I think, it reveals some underlying biases in the text database.

In deriving a programmatic procedure, I had two motivations. The first was to tease out the language used here: to strip away the logic jumps that we often make, and take for granted that others will also make, to reveal not just what the language maps to at a base level, but what its usage means about our society’s biases (or perhaps just the biases of those developing this deep learning program). We can learn a lot when the technology gets it wrong, or by taking a closer look at what the computer has been trained to focus on. My second motivation was to give the computer a voice, to empathize with its struggle to identify and think through what it sees, having only a given set of data points to work with and learn from. Because of both the limited size of the data set and the level of human subjectivity, it’s bound to get some things wrong.

The process:

  • All of the text from my Deep Learning search (in the photo above) got saved to a txt file.
  • I created a text file with various “beginnings” of sentences (“This is a…”, “I’ve got it, it’s…”, “She’s a…”).
  • I wrote a script to randomly select a beginning and a Deep Learning description and print a combined line of the two. I ran this script until I was satisfied with the output and saved that output to a txt file named results.txt.
  • Then, I used wordnik.com to retrieve the definitions of key descriptive words in each line of the Deep Learning text, and saved these to a txt file.
  • The second script, which outputs the final result, reads the results.txt file as input. It also reads the key words and the definitions of those key words.
  • The result is a mashup of these three inputs, with “But that would mean it’s” and “So that means it’s” occasionally added to give the effect of a thought process (a minimal sketch of this step follows the list).
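
A minimal sketch of that final combination step, assuming the intermediate files described above already exist (the file names and tab-separated format here are stand-ins, not the actual code on github):

import random

# results.txt holds the combined "beginning + description" lines from the first script;
# definitions.txt holds stand-in "word<TAB>definition" pairs looked up on wordnik.com.
lines = open("results.txt").read().splitlines()
definitions = dict(l.rstrip("\n").split("\t", 1) for l in open("definitions.txt") if "\t" in l)

connectors = ["But that would mean it’s", "So that means it’s"]

for line in lines:
    print(line)
    # When a key word appears in the line, occasionally follow it with a connector
    # and that word's definition, to give the effect of a thought process.
    for word, definition in definitions.items():
        if word in line and random.random() < 0.5:
            print(random.choice(connectors))
            print(definition)
            break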

Code is available on github.
I first started out using a dictionary for the words (keys) and their definitions (values); however, I was encountering some challenges with these keys and values needing to be added to a list as the program iterated through the lines from the input file. I defaulted to using lists instead, because I got to a result I was happy with and the procedural shift didn’t impact the concept (it just dumbed down the program a bit). However, a more elegant version of this code is in the works.

Experiment 2: Focal Stacking with Helicon Focus

The method of focal stacking makes sense when working with microscopes because of the shallow depth of field. When scanning items with my initial prototype, I would frequently turn out images with up to a third of the subject out of focus. This gets compounded when a hundred of these images are stitched together via photogrammetry (PhotoScan). I think my ultimate setup will involve a combination of focal stacking and photogrammetry together.

Eric Rosenthal introduced me to Helicon Focus last semester in Digital Imaging. This software takes a set of photographs (in this case about 20) of the same item, each captured at a different focus distance. That means each photograph consists of in-focus material and out-of-focus material. The software stitches the in-focus data together to create one fully in-focus photograph.
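
Conceptually, the stacking step looks something like this (a simplified sketch using OpenCV and NumPy rather than Helicon Focus itself, assuming the frames are already aligned): for each pixel, keep the value from whichever photo in the stack is sharpest at that spot.

import glob
import cv2
import numpy as np

# Load the aligned focus stack.
stack = [cv2.imread(path) for path in sorted(glob.glob("stack/*.jpg"))]

# Measure local sharpness of each frame with the Laplacian (high response = in focus).
sharpness = []
for frame in stack:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    lap = np.abs(cv2.Laplacian(gray, cv2.CV_64F))
    sharpness.append(cv2.GaussianBlur(lap, (0, 0), 3))  # smooth to avoid speckle

# For every pixel, find the frame index where that pixel is sharpest.
best = np.argmax(np.stack(sharpness), axis=0)

# Assemble the all-in-focus image; 'best' also doubles as a rough depth map,
# since the frame index corresponds to the focus distance.
composite = np.zeros_like(stack[0])
for i, frame in enumerate(stack):
    composite[best == i] = frame[best == i]

cv2.imwrite("composite.jpg", composite)
cv2.imwrite("depth_map.png", (best * (255 // max(len(stack) - 1, 1))).astype(np.uint8))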

bead_3Dmodel.obj
Image of a bead compiled of ~20 individual images using Helicon Focus

depth_map
Depth map image of the bead, via Helicon Focus.

opacity_map
Opacity map of bead, via Helicon Focus

sweaterFuzz_3Dmodel.obj
Image of a piece of sweater fuzz compiled of ~20 individual images using Helicon Focus

Additionally, the relative focus distance from photo to photo provides the software enough information to create a relatively accurate 3D surface model. With the USB microscope, I was just manually changing the focus, so the focus increments from photo to photo aren’t precise in this first test. That lack of precision got transferred to the depth information in the 3D models (however, I’m not 100% sure how this software is getting and interpreting the data). The intention of the test, though, wasn’t 100% precision but an indication of whether this is a viable technique and software program to use for my system. The results seem promising, and I’d like to continue down this path.

[Stereo renders of the sweater fuzz and bead 3D models, via Helicon Focus]

 

The next step in this exploration will be to create 3 or 4 surface models of each item from equal angles around the object. So the process would go like this (a rough automation sketch follows the list):

  1. Position the item on the scanner
  2. Take 1 photo so that the closest tip of the item is in focus and everything else is out of focus
  3. Change the focus by 1/30th of the full focal range, take another photo
  4. Repeat steps 2 and 3 30 times
  5. Scanner rotates either 120 or 90 degrees
  6. Repeat steps 2 through 5 until turntable has covered the item from 3 or 4 angles (will need to test with both)
  7. Each stack of photos from steps 2 through 4 will be compiled into individual 3D model surfaces.
  8. Somehow, these 3 – 4 3D model surfaces will get stitched together, probably in MeshLab.
  9. (An alternative would be to compile the focal-stacked images and use them as JPEGs instead of OBJs (2D instead of 3D). If I have a larger set of compiled images (~50 instead of 3 or 4), I could use these crisper images as input for PhotoScan (instead of the partially out-of-focus images I’ve been using) and proceed with the same process as in my initial prototype. This would require a tremendous amount of photo data to begin with, though, and I’m not sure the process would be more effective. It’s worth an initial test and, if it seems promising, I could determine a process for managing the data efficiently.)
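
As a rough sketch of how steps 1 through 6 might be automated (every hardware function here is a hypothetical placeholder; the actual scanner control still has to be built):

FOCUS_STEPS = 30            # 1/30th of the focal range per photo (step 3)
ANGLES = [0, 120, 240]      # three views around the turntable (or four at 90 degrees)

def set_focus(step):        # hypothetical: drive the microscope's focus motor
    print(f"  focus step {step + 1}/{FOCUS_STEPS}")

def capture_photo(angle, step):   # hypothetical: trigger the camera and save the frame
    print(f"  captured angle={angle} step={step + 1}")

def rotate_turntable(angle):      # hypothetical: rotate the scanner stage
    print(f"turntable at {angle} degrees")

for angle in ANGLES:              # steps 5 and 6: one pass per viewing angle
    rotate_turntable(angle)
    for step in range(FOCUS_STEPS):   # steps 2 through 4: sweep focus through the item
        set_focus(step)
        capture_photo(angle, step)
    # each angle's stack then goes to Helicon Focus to become one surface model (step 7)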

The Things We Carry Are Us: A Thesis Proposal

Thesis Title: The Things We Carry Are Us   |   Scope3D

Synopsis: The Things We Carry Are Us is a series of 3D printed sculptures that explore our relationship to the microscopic items and beings living on or in our bodies.

To create the pieces I’m building Scope3D, a custom-designed microscopic 3D scanning system.

Description:
The Things We Carry Are Us is a series of 3D printed “portraits,” sculptures built from scaled-up 3D models of material existing on or inside our bodies. The material – hair, fingernails, teeth, sweater fuzz – will be collected from myself, peers, and strangers, and each sculpture will be presented as a classical still life. Each portrait will contain the same collection of materials for each person, yet the formal differences in material from one person to the next are hyperbolized as details are picked up in the microscopic imaging and scaled up for final presentation.

With The Things We Carry Are Us, I ask us to reimagine human identity. What stories about the human experience get uncovered when microscopic forms become structures we can coexist with and view at eye level? How might the fingernails of a machinist vary from those of a hand model? Through this exploration of a layer of our physical world that we don’t often directly interact with, I hope to uncover unexpected narratives of physical identity and human experience.

Scope3D is a microscopic 3D scanning system (physical device and software program) for reconstructing scalable 3D models, to be used as a tool for The Things We Carry Are Us sculptures and packaged as an open source toolkit.

I envision a future (most likely beyond the scope of this thesis period) for this device to be used out in the field by archeologists, scientists and medical practitioners, in classrooms by educators and students curious about life sciences, or in the studios of makers and artists as a tool for inspiration and experimentation.

Personal Statement:
I derive great inspiration from the world of biology, a field in which the more we discover the more we seem to exist in a sci-fi reality that forces us to face existential questions about our perceptions of physical identity and our relationship to the material world.

My inspiration to work with 3D scanning technology was born directly from my skepticism of it. Perusing sites like 123D Catch, I found myself having an internal argument—“wow, this technology is so powerful and promising”—but—“but who really cares if we have a 3D model of the thing I’m holding in my hand already? What can be gained by experiencing this digital 3D reconstruction?”

This led me to microscopy, in which the added third dimension can truly uncover real-world data that we would otherwise have no access to. The forms we can uncover allow us to explore the relationship between scale and understanding, scale and knowledge, scale and fear. When a form’s scale can be easily manipulated, we can start drawing connections between objects in the physical world that may have formerly been overlooked. I’m driven by a sense of childhood curiosity and wonder as I get to discover these connections.

 

Auden, Creative Reading and Python

Based on WH Auden’s A New Year Greeting

A simple experiment with pleasing (for me) results (it keeps only lines that contain commas, printing the remainder of each line after the comma):
Yeasts,
Viruses,
in the pools
my purposive acts,
clinging to keratin rafts,
or the Flood
sooner or later, will dawn
too rancid, for you,
and I
subject to Judgment.

(code can be found here – a direct code example from the class notes)
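
The filter itself is only a few lines (a sketch assuming the poem is saved locally as auden.txt; the class-notes version may differ):

# Print only the part of each line that follows the first comma.
for line in open("auden.txt"):
    if "," in line:
        print(line.split(",", 1)[1].strip())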

Another simple experiment, which I found illuminated the underlying relationship between the speaker and the subjects of the poem (it prints only lines containing “my” or “your”, along with the word that follows):

my
greetings
my
ectoderm
your
size
my
pores
my
fore-arms,
my
scalp.
your
presence,
my
inner
my
rocketing
my
games,
your
dramas
your
priests
my
mantle

(code can be found here – a direct code example from the class notes)

Inspired by the theme of a life cycle/evolution in Auden’s poem:
$ cat 1born.txt 2grow.txt 3mutation.txt 4decay.txt

y traditi
taking stock
ngs t
a, Viruses
and Anaero
ppy New
or wh
as Middle-Eart
or creatures yo
hoice o
settle y
that suits y
my pores or
of arm-pit a
he deserts of m
ol woo
nies
quate warmth my greetings to all of you, Yeasts,
my ectoderm
I offer
my pores or the tropical
my fore-arms,
my scalp.
I will supply
me annoy with your presence,
my inner weather affect
my rocketing plunge
I should like to think that I make
my games, my purposive acts,
me I dress or undress,
I dress or undress,
I bathe?
my mantle suddenly turnsallots tradition day On this
our lives, to stock taking of
all greetings to you, my Yeasts, of
Bacteria, Viruses,
Aerobics and Anaerobics:
A Happy Year Very New
all ectoderm for my to whom
me. is as Middle-Eart to
creatures I size For offer your
choice habitat, free of a
zone the in yourselves settle so
that the best, suits in pools you
of the tropical my pores or
forests crotch, arm-pit and of
of the in fore-arms, my deserts
cool or of scalp. woods my the
I supply Build colonies: will
adequate and warmth moisture,
you lipids sebum need, the and
never on you condition
with do presence, annoy me your
but good behave should, as guests
into not acne rioting
or or a boil. athlete’s-foot
Does my inner affect weather
where live? surfaces the you
Do unpredictable changes
record my plunge rocketing
the in from tift when mind fairs is
and throughts relevant occur
to will fouls when nothing happen
one calls it no and rains. and
I that should make to think like I
not a impossible world,
cannot Eden it but an be:
acts, my games, purposive my
there. to turn may catastrophes
were you religious If folk,
dramas how justify would your
unmerited suffering?
would priests myths By your account what
for the come that hurricanes
twenty-four every hours, twice
I or undress, time dress each
when, clinging rafts, keratin to
whole swept away are cities
or space, in to perish Flood the
bathe? that when I death to scalds
sooner Then, or later, will dawn
Day Apocalypse, a of
suddenly mantle when my turns
rancid, you, too too for cold,
predators appetizing to
I sort, of and a fiercer
of and excuse stripped am nimbus,
Past, a to Judgment. subjectmy gve acts,
may tre.
If you us folk,
how woul justify
unmeriing?
By wha account
for thcome
twice rs,
each tss,
when, afts,
whoay
to Flood
that sca bathe?
Th dawn
a Day ofe,
whe turns
too colu,
appeators
of a , and I
am strs,
a Past,ent.

(the code for this poem can be found here – 1born.py, 2grow.py, 3mutation.py, decay.py)


The original text:
A New Year Greeting
by WH Auden

On this day tradition allots
to taking stock of our lives,
my greetings to all of you, Yeasts,
Bacteria, Viruses,
Aerobics and Anaerobics:
A Very Happy New Year
to all for whom my ectoderm
is as Middle-Earth to me.

For creatures your size I offer
a free choice of habitat,
so settle yourselves in the zone
that suits you best, in the pools
of my pores or the tropical
forests of arm-pit and crotch,
in the deserts of my fore-arms,
or the cool woods of my scalp.

Build colonies: I will supply
adequate warmth and moisture,
the sebum and lipids you need,
on condition you never
do me annoy with your presence,
but behave as good guests should,
not rioting into acne
or athlete’s-foot or a boil.

Does my inner weather affect
the surfaces where you live?
Do unpredictable changes
record my rocketing plunge
from fairs when the mind is in tift
and relevant thoughts occur
to fouls when nothing will happen
and no one calls and it rains.

I should like to think that I make
a not impossible world,
but an Eden it cannot be:
my games, my purposive acts,
may turn to catastrophes there.
If you were religious folk,
how would your dramas justify
unmerited suffering?

By what myths would your priests account
for the hurricanes that come
twice every twenty-four hours,
each time I dress or undress,
when, clinging to keratin rafts,
whole cities are swept away
to perish in space, or the Flood
that scalds to death when I bathe?

Then, sooner or later, will dawn
a Day of Apocalypse,
when my mantle suddenly turns
too cold, too rancid, for you,
appetising to predators
of a fiercer sort, and I
am stripped of excuse and nimbus,
a Past, subject to Judgement.

http://allpoetry.com/A-New-Year-Greeting