Here is a list of the philanthropies and charities where we donated this December 31st, after a messy year of the Dumpster administration:
My paranoia level on this project is about a seven out of ten, where one on the scale would be a trusting grandma and ten is well-informed tinhat.
Lately I have been wanting one of the genetic testing startups to process my spit, to confirm my profound, stunning whiteness. Though regardless of how many data sharing agreements and EULAs they sign, I do not trust any entity with my genetic details, government or private. An old friend who works in human genetics briefly tried to allay my concerns, but after a cocktail or two revised his stance with “yeah, I guess they could do nasty things with your insurance premiums.” So I set out to get a genetic test done anonymously. This is far from easy, but very doable.
First, start with the basics: Confirm the price of the test, including any tax or shipping of the spit-kit back & forth. Recently the 23andMe ancestry test was about $100, to which you might add $35 for extras. Go pull this money out of an ATM in cash. Then stop by your neighborhood pharmacy and buy a debit gift card. Just stating the obvious, but do not buy the gift card with a personal credit card, since you want no digital connection between your actual identity and the payment.
Next find a public computer where you can browse the Internet. It has gotten more difficult to find traditional Internet cafes in big cities, since public wifi is everywhere… Dressing well and asking to use a fancy hotel’s business center is pretty easy. Do not use your computer at work or your machine at home. Not even with your browser in porn mode, nor using the awesome Brave project. A Tor setup might work, or a burner phone with a decent mobile browser. Though make sure you buy that burner phone with cash.
On your public computer, create a throwaway email account. Nowadays, Yandex is your best choice for a provider that does not require an SMS validation code. Obviously you want to use invented contact information on the email account.
Sign up for the genetic testing service with the new email account, and pay for the spit-kit with your gift card.
Next is the most delicate part of this whole process: You will need a mailing address to receive the package, but an address not tied directly to your identity. Maybe a doorman building, a large office where you know the people in the mail room, or the like. Find a mailing address shared by a bunch of people, and use this address “care of” to ship the spit-kit.
If you want to spend any remaining credit on your gift card, make sure to drain exactly the remaining balance and do any top-up with cash. Do not top-up a gift card transaction with your personal credit or debit card.
Now after the package arrives, you may need to associate the kit with your particular genetic testing account. This typically means entering a barcode number into their website. Make sure to use that public computer (again) for this association. I believe the genetic testing companies’ fulfillment services do not automatically associate spit-kits with accounts, which leaves a startup the option of quickly handing out a zillion kits as conference swag.
Spit in your kit, and drop the package in a public mailbox.
In a month or two, your results should be ready. Stop by a public computer for a third time to download the reports and print them. Maybe scan those PDFs at home, and pitch the burner phone. Voilà: Very nearly fully anonymous genetic testing!
Breaking this anonymity would probably require security video analysis, or a very compromised public computer. At that point, all bets are off since you are at the level of a three letter agency, and you have bigger concerns than which part of Europe your ancestors crawled on first.
Menu notes from dinner at Sons & Daughters in San Francisco on October 20th, 2017.
- Aperitif was a glass of Cava
- Taste was a leek and beluga caviar tartlet (crunchy)
- Glass sphere w/ celery broth, dehydrated okra seeds, chestnut puree, and hipster bacon
- Salad of abalone & cabbage, w/ black garlic puree & mild pistachio butter
- Broccoli rabe, radish, tomatillo salsa (very off, chemical flavor)
- Delicata squash roasted & pureed, linguini of Granny Smith apples, shaved dehydrated foie gras
- Very al dente purple barley w/ lobster mushrooms & dark roasted mushroom broth (barely there tarragon)
- Bavette steak (yawn) stuffed w/ truffles & salsify a few ways
- Set “Japanese cheesecake” of Big Rock Blue w/ quince (awful, sent it back)
- Limequat ice cream, buckwheat honey, fennel meringue
- Sous vide sesame cake w/ dehydrated buttermilk sprinkles, agastache greens, frozen carrot puree
- Bookended meal w/ a chocolate & honey sphere tartlet
NLP analysis done on a dataset of about 8,000 transcripts of Dumpster going back to 2007. (As in “Trump’s a Dumpster-fire.”) Unfortunately there are no trends that obviously jump out. He has probably been keeping to the book more closely than generally expected, at least in these prepared interviews.
Here are three early transcripts: “xx00133” from Showbiz Tonight (CNN in 2006), “xx00598” from Your World with Neil Cavuto (Fox in 2009), and “xx00911” from Nightline (ABC in 2009):
How has the implied grade-level and complexity of Dumpster’s speaking changed over time?
How has his information content changed over time (empirical bag-of-words entropy)?
If I assume the distant past is the benchmark for Dumpster authorship, does the recent speaker seem like the same person? (This uses function word distributions.)
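The entropy question above can be sketched in a few lines of Python. This is an illustrative toy on made-up sentences, not the actual analysis pipeline:

```python
import math
import re
from collections import Counter

def bag_of_words_entropy(text):
    """Empirical Shannon entropy (bits per word) of a transcript's unigram distribution."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(words)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Toy comparison: repetitive speech carries less information per word.
terse = "we will win we will win we will win"
varied = "our campaign intends to broaden its appeal across several new states"
assert bag_of_words_entropy(terse) < bag_of_words_entropy(varied)
```

A speaker whose word choice narrows over time would show a downward drift in this number across the dated transcripts; the grade-level and function-word questions get answered with readability formulas and per-word frequency profiles in the same spirit.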
Menu notes from our dinner at Motoi in Kyoto, Japan on May 13th, 2017.
- aperitif: Rice flour dumpling
(deep-fried rice flour dumpling, stuffed w/ a bit of sweet bean paste, wrapped in prosciutto)
- amuse-bouche: Firefly squid, beans, potato mousse
(flute w/ white potato mousse, green peas, broad beans, squid, topped w/ a soft consomme jelly)
- porc: Baked pork back ribs Cantonese style
(small slices of tea-marinated fatty pork w/ crisped skin, strawberries & Italian basil)
- pousse de bambou: Kyoto’s fresh bamboo shoot, wakame soup
(lukewarm wakame soup w/ fresh bamboo & shiitake mushrooms & sansho leaf, confused but tasty)
- asperge blanche: White asparagus
(shredded white asparagus, noodles, caviar, edible flowers w/ thin onion-y aioli)
- ris de veau: Sautéed sweetbread and herb salad
(sauteed sweetbreads, bitter green leaves, balsamic vinaigrette drizzled at the last minute)
- poisson: Panfried Japanese bluefish, Kyoto’s bracken, butter sauce
(wild bracken, onion bulb heads, beurre blanc w/ tomato concasse)
- boeuf: roasted Ozaki beef
(rare, tendon-y wagyu beef, fiddlehead ferns, white onion, w/ cherry demi sauce)
- dessert-1: Walnuts with lemon
(walnut ice cream, lemon granita, icy)
- dessert-2: Banana, coconut cream, rasberry [sic]
(coconut & raspberry frozen cream wrapped in a brown banana fruit leather, candy-like)
- dessert-3: Miyazaki’s mango
(mango, Campari liquid nitrogen granita, meringue shingles, white miso whipped cream, fromage blanc ice cream)
(tea & chocolate macarons, champagne meringue cookies, cannelle, also rosewater jelly, chocolate & coconut truffle, cinnamon curl cookie)
We had dinner at Atelier Crenn on Friday night, and here are details on her menu:
- White chocolate shell filled w/ cider, topped w/ creme de cassis jelly (Kir Breton / “Spring has come with its cool breeze”)
- Trout roe in a tiny buckwheat cheese tart, and black truffle & citrus salad w/ greens (Citrus, Golden Trout Roe, Black Truffle / “Orbs of the air, earth, and sea coalesce”)
- Shreds of fried potatoes w/ seaweed powder and gold flake & smoked trout w/ foie gras mousse & foie gras crunchy skin & Greek yogurt cream (Fish & Chips / “In search of those swimming creatures, tasty and crispy”)
- Leek, fennel, (olive?) oil broth, sushi rice paddy w/ kombu, butter poached sea urchin w/ sesame seeds (Koshihikari Rice, Wakame, Barigoule / “Come with me and look into the golden light”)
- Caviar w/ rice cream (koji?) & salty, buttery, seaweed-crusted rutabaga (Caviar, Rutabaga, Koji / “A burst of oceanic feeling, salty black pearls”)
- Abalone slices w/ oyster cream, egg yolks & brioche w/ fine herbes butter, whipped beef fat butter (Abalone, Roasted Garlic, Oyster Cream / “The whimsically ebullient blue umami”)
- Morels w/ lardo & parmesan custard, pine nuts, smoked creme fraiche spheres (Morrel, pint [sic] Nu, Parmesan Custard / “Earthly song of the elfin singers”)
- Wagyu beef, pickled carrot jelly, edible flowers, roasted chicken cognac sauce (A-5 Wagyu, Foraged Spring Herbs, Carrot Veil / “Under a shroud stirs the tender-footed beast”)
- Harbison cheese tart, onion marmalade, quince, zucchini weave cover (Cow’s Milk Cheese, Quince, Onion Marmalade / “Green lattice, in dulcet reminiscence”)
- Pistachio ice cream “olive” (green tea?) olive oil (Recreated Olive / “A precious token”)
- Chestnut, sage cream in little chocolate egg shells & fillo ‘maki’ wrapped around yogurt, apple, fennel & blood orange ice, rosette of something pickled (Egg of Chestnut & Sage / Toasted Fillo, Yogurt, Apple, Fennel / Blood Orange Ice / “Walking deep in the woods” / “Strolling on, into the orchard” / “As the earth might have something to spare”)
- Sorrel, mint sponge w/ pine nuts, blackberries reconstructed from spheres, stuffed w/ ice cream & shaved dark citrus cookie shaved like truffle (“The Forest” / “Spring has come and is full of sweet surprises”)
- Tree of meringue cookies w/ calabash (?) jam, raspberry w/ chocolate jellies, nougat squares & box of chocolates, a white chocolate bark, white chocolate w/ coffee bonbon, Peruvian dark chocolate square truffle (Mignardises / “Sweetness, bounty, thanks”)
- Granola sticks to takeaway
Never underestimate the bandwidth of a station wagon full of tapes hurtling down the highway. [Andy Tanenbaum, 1989]
As someone who did a lot of computing before The Cloud or Dropbox was a thing, I have a little box of hard drives tucked away in my living room. A bunch of these drives will be paperweights by now, the ball bearings frozen-up or platters otherwise unreadable, but I would happily pay for the salvageable data to be thrown up on Amazon for posterity and my own nostalgia. I tried trickle-copying the data over our Sonic DSL connection, but things were happening at a geologic time scale. Enter Snowball, Amazon’s big data transfer service. You sign up and the service piggy-backs on your usual Amazon Web Services (AWS) billing & credentials. Then they ship you a physical computer, a 50 pound honking plastic thing that arrives on your doorstep via two-day UPS:
The first thing I noticed was a cleverly-embedded Kindle that serves as both shipping label and user interface:
The plastic enclosure itself opens DeLorean-style to reveal a handful of spooled cables:
You plug the Snowball into your normal 120V AC mains power, and boot the thing:
Next you install some AWS software on another machine on your network, and then use that software to copy data over the network to the Snowball itself:
Tucked away inside is a serious amount of disk storage, 50 terabytes in the case of the Snowball I tried. The device itself is an intimidating “engineering sample,” whatever that means:
This is where I noted the first serious snag in my plans: The Snowball relies upon your own (home) network for data transfer, which puts a bandwidth bottleneck at your router. My suddenly-beleaguered Netgear thing was tapped-out within moments, and installing Linux on the router (DD-WRT) would not have gotten me further than a 2x speedup.
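The back-of-the-envelope arithmetic makes the bottleneck concrete. A hypothetical helper (not part of any Snowball tooling), using decimal terabytes and ignoring protocol overhead:

```python
def transfer_days(terabytes, megabits_per_second):
    """Days to move `terabytes` of data at a sustained link speed in Mbit/s."""
    bits = terabytes * 1e12 * 8               # decimal terabytes to bits
    seconds = bits / (megabits_per_second * 1e6)
    return seconds / 86400                    # seconds in a day

# Filling a 50 TB Snowball through a saturated home network:
print(round(transfer_days(50, 1000), 1))  # gigabit Ethernet: ~4.6 days
print(round(transfer_days(50, 100), 1))   # 100 Mbit/s link: ~46.3 days
```

At DSL speeds the same sum lands in years, which is why the geologic-time trickle-copy never stood a chance.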
Also the Snowball client runs on another machine on your network, which is not much of a limitation when used in an institution. However I was copying data from an external hard drive sitting in a SATA/IDE-to-USB 3.0 adapter thing, which put another bottleneck and layer of complexity at the USB port.
Why not just interface my external hard drives directly to the Snowball? Or maybe even install the hard drives as temporary internal disks within the enclosure? The enclosure is almost hermetically sealed (“rugged enough to withstand a 6 G jolt”), and exposes only Cat 5 and fiber network ports.
Here is me telling the Snowball via its command-line client that it is ready to be returned to AWS in Oregon:
So! I found the Snowball to be a relatively sophisticated and honest approach to the realities of the Internet bandwidth vs. storage size growth curve. However it is not a good solution for those of us wanting to upload a bunch of rotting hard drives to The Cloud. Amazon has a legacy service that accepted shipped disk drives directly, but I believe it has gone away. On the other hand, I expect Snowball to be a very efficient and slick solution for most organizations. But for the guy sitting on some dusty hard disks, it did not get the ball rolling.
Last month was the two year anniversary of the website Hipsteraunt, which I built with my friend Lance Arthur. He did the design, I did the random menu generation. It is a quirky bit of AI and NLP under-the-hood, so a user gets menus featuring free-range suspended chicken feet, truffled shisito pepper with achiote, and marshmallow crudo, at a place with an ampersand in its name. The inspiration had been a particular dinner out in San Francisco, at an immensely overrated restaurant. But it could have been Brooklyn or the West Loop. I am a quant & machine learning researcher by happy vocation, but also a chef by training. (Le Cordon Bleu with honors, thank you.) So the term “foodie” has always struck me as what privileged folks call themselves when they like to eat fancy food, but would not be caught dead hanging out with a line cook.
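Hipsteraunt’s actual generator is more elaborate, but the core idea fits in a few lines. A minimal sketch with hypothetical ingredient pools (the real lists are much larger, and the dishes above came from them):

```python
import random

# Hypothetical ingredient pools for illustration; not the site's real data.
PREPARATIONS = ["free-range", "truffled", "suspended", "deconstructed", "foraged"]
INGREDIENTS = ["chicken feet", "shishito pepper", "marshmallow crudo", "burrata", "purslane"]
ACCENTS = ["achiote", "togarashi", "za'atar", "smoked sea salt"]

def menu_item(rng=random):
    """One faux-edgy dish: a preparation plus an ingredient, sometimes with an accent."""
    dish = f"{rng.choice(PREPARATIONS)} {rng.choice(INGREDIENTS)}"
    if rng.random() < 0.5:
        dish += f" with {rng.choice(ACCENTS)}"
    return dish

print([menu_item() for _ in range(3)])
```

Random recombination over a curated vocabulary is the whole joke: the output is indistinguishable from a certain kind of real menu.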
Hipsteraunt remains a tender satire of a certain sort of fetishized dining out. It was meant to be an acerbic call to check-your-privilege, together with a reminder that nothing in food is new. No combination of ingredients or flavors has not been tried a thousand times before. Even offal and the Asian flavors everyone loves to exoticize. (Awkward…) We lived through the fusion cuisine of the 1980s, remember? In hindsight, it might have cut a bit too close to the bone. The site garnered plenty of attention, but less heady pokes like the fake Guy Fieri menu and the brilliant Jacques le Merde have been far more successful. An annoying bug with making menu URLs permanent snagged things up the first couple weeks, too. Nonetheless on Hipsteraunt’s second birthday, I celebrate by raising an artisanal cocktail (a lemongrass aviation, perhaps) and toasting the addition of a few new ingredients: Keep an eye out for those trendy signifiers of faux-edgy cuisine we all love, like burrata and purslane, za’atar and togarashi. Goodbye ginger, goodbye almond milk. But it looks like bacon is still there.
Breath Catalogue is a collaborative work by artist/scholars Megan Nicely and Kate Elswit, and data scientist/interaction designer Ben Gimpert, together with composer Daniel Thomas Davis and violist Stephanie Griffin. The project combines choreographic methods with medical technology to externalize breath as experience. Dance artists link breathing and movement patterns in both creation and performance. In Breath Catalogue, the goal is to expand the intrinsic dance connection between breath and gesture by visualizing and making audible the data obtained from the mover’s breath, and inserting this into the choreographic process to make the breath perceptible to the spectator. To do so, they are working with prototypes of breath monitors from the San Francisco-based startup Spire. Following the San Francisco premiere, Katharine Hawthorne interviewed Ben Gimpert to understand the inner workings of the technology interaction.
Katharine Hawthorne: What is the output of the breath sensor (what does it “measure”), and how does this get manipulated or translated into the visualizations?
Ben Gimpert: The sensor measures four things: the diaphragmatic or chest pressure placed on the device, as well as three dimensions of acceleration. These four numbers are sampled about thirty times per second, and then sent over Bluetooth radio to a laptop.
Is there latency in the sensor, in other words, how quickly is information transmitted and processed?
There is very little latency between sampling and receiving the data via Bluetooth on the computer. However, there are a lot of complications. First, the Bluetooth transmitter in the breath sensor can be easily disrupted or interfered with by other radio frequency devices. Ironically, a dancer’s body can also block the radio transmitter in the device.
There is also an important but nuanced frame-of-reference problem when using this sort of sensor in performance: The breath sensor does not know the Euclidean origin of the space, what acceleration might occur at point (0, 0, 0). It similarly does not know what is the beginning or end of a breath’s pressure. For this reason, the different breath visualizations avoid working with much memory of a breath. They always work from the difference between this moment’s breath pressure, and the last moment one thirtieth of a second ago. For the mathematically inclined, the viz uses plenty of moving averages and variance statistics. These moving averages give an intentional sort of latency, as Kate or Megan’s movement eases into the visuals.
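That difference-plus-moving-average idea can be sketched in a few lines, assuming 30 Hz pressure samples. This is an illustration of the technique, not the actual Breath Catalogue code:

```python
class BreathSmoother:
    """Tracks moment-to-moment pressure change with an exponential moving average,
    so visuals react to relative breath motion rather than absolute readings."""

    def __init__(self, alpha=0.1):
        self.alpha = alpha      # smoothing factor; a small value suits ~30 Hz sampling
        self.last = None        # previous pressure sample (1/30 s ago)
        self.ema_delta = 0.0    # smoothed pressure change driving the visuals

    def update(self, pressure):
        if self.last is None:   # first sample: nothing to difference against
            self.last = pressure
            return 0.0
        delta = pressure - self.last
        self.last = pressure
        self.ema_delta += self.alpha * (delta - self.ema_delta)
        return self.ema_delta

s = BreathSmoother()
for p in [0.0, 0.4, 1.1, 1.0]:   # a few fake pressure samples
    signal = s.update(p)         # feed this to the projection, not the raw reading
```

Because only differences are smoothed, the code never needs to know where a breath “begins,” and the lag of the moving average is exactly the eased-in latency described above.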
I am curious about how you chose the specific graphics and visuals used in the piece (the lines and the other projected images).
The famous Joy Division album cover. Smoky particles at a rave in the nineties. The dancers wanting their breath to leave an almost-real residue in the space.
In each case the breath is not visualized literally, because that would be boring. If the pressure sensor has a low reading, suggesting that Kate or Megan is at an inhale, the code might move the frequency blanket imagery in a snapped wave upward. Or invert the breath by sending the neon bars outwards.
Relatedly, how much did you collaborate with the lighting designer on integrating the data visualizations into the overall visual landscape of the performance?
Alan [Willner] was great. He designed the lighting based on videos we sent him of the piece and the visualizations ahead-of-time.
Who is driving the collaboration? Did the dancers/choreographers suggest modes of interaction and then the visuals develop to suit the choreography? Or did the possible visualizations shape the movement landscape?
I have seen a lot of contemporary dance where an often-male technologist projects his video onto usually-female dancers. This is both sloppy politics, and pretty lazy. I wanted there to be a genuine feedback loop between what my code would project in the space, and how Kate and Megan move. So I was in the dance studio with the dancers throughout the creation of the piece.
Can you provide an example of a section where the “movement” led the development and/or a section where the “tech” led? I want to understand this feedback loop better. How was this process different than a traditional dance/tech collaboration?
The tech side of a typical tech/dance collaboration starts with an existing piece of software like MaxMSP or Isadora. The tech person puts together a couple of cool-looking visualizations, and then brings these along to the studio. In rehearsal, the visualizations are typically put on in the background while the dancers “interpret” or literalize the visualization with their bodies. This produces a lot of great-looking stuff, but there is very little feedback going either direction. In Breath Catalogue, we developed a custom piece of software specifically for the piece. This custom approach worked well with a hardware prototype like the sensor, and avoided a proprietary (commercial) software dependency. In a very practice-as-research sense, I would often make live changes to the code while in the studio. The Breath Catalogue visualizations run in a web browser, so it was easy for Kate and Megan to run them outside of the studio, at home. We are planning to release the Breath Catalogue software under an open source license, to support the community. (Some utility code is already released on GitHub.)
How much communication occurs between you and the performers throughout the performance?
Quite a bit. The breath sensor was an unpredictable aspect of the performance, but we three did not want to fake it. So we decided to err on the side of adaptivity instead of pre-recording everything, and this meant a lot of thumbs-up & down cues during the transitions, which Hope Mohr noticed in her review. Some of our music was cued off of Kate or Megan taking a certain shape, while at other points the dancers were waiting on the sensor’s connection.
There’s a moment in the piece when Megan takes off the sensor and transfers it to Kate. Is their breath data significantly different? Also, has this moment ever caused any technical difficulties? Does the sensor have to recalibrate to a different body?
Yes, Kate and Megan each have a distinct style of breathing. If you are adventurous, this can be teased out of the breath data we posted online. In this piece, Megan’s breath is usually more staccato and Kate’s sustained. The sensor reconnects at several points, which is technically challenging. In the next iteration of Breath Catalogue, we will be using multiple sensors worn by one or more dancers. The visualization software that I built already supports this, but it is trickier from a hardware standpoint.
In your experience, how much of the data visualizations translate to the audience? How easy is it for an untrained eye to “get” what is going on and understand the connection between the performer’s breathing and the images?
It turns out to be quite difficult. We added a silent and dance-less moment at the beginning of the piece so the audience could understand the direct effect of the dancer’s breath on the viz. Yet, even with that, the most common question I have been asked about my work with Breath Catalogue was about the literal representation of the breath. As contemporary dance audiences, we are accustomed to referential and metaphorical movement. However I think visualizations are still expected to be literal, like an ECG. Or just decorative.
What is your favorite part of the piece?
In the next-to-last scene, the wireless pocket projector was reading live sensor data from the dancer via the attached mobile phone. Which was pretty fucking tough from a technical standpoint. Also the whimsical moment when Kate watches and adjusts her breath according to the bassline of that Police song. And when Megan grabs the pocket projector for the film noir, and then bolts.
If you had the time to rework or extend any section, which would it be?
In one scene we remix the live breath data with data from earlier in that evening’s show. I would have made this more obvious to the audience, because it could be a pretty powerful way to connect breath and time passing.
The great Dinah Sanders does an annual blog post with her election picks, which is incredibly useful for navigating California’s referendum system. In this vein, here is a list of the philanthropies and charities where we donated this December 31st:
- Organization for Black Struggle (25%), an old-school post-Black Power organization addressing the asshattery in Ferguson, MO. Open Society Foundations just gave them a lot of money to run with.
- Missourians Organizing for Reform and Empowerment (10%), a new-school group addressing the asshattery in Ferguson, MO. Open Society Foundations just gave them a lot of money to run with.
- California State Parks Foundation (10%), to protect some of the most beautiful places on Earth.
- Doctors Without Borders (25%), big and famous for a reason, dealing with Ebola well.
- OneVoice International (15%), supporting a two-state solution in Israel, smartly.
- Girls Who Code (15%), young girls need to see serious, non-feminized (“softened”) science as an awesome career.