Nuts and Bolts

Title: Nuts and Bolts: Seven Small Inventions That Changed the World (in a Big Way)

Author: Roma Agrawal

Completed: Jan 2024 (Full list of books)

Overview: I was very excited to read this book, hoping it could tell a story of human history and invention by focusing on a handful of inventions/discoveries. I was a bit disappointed. The discussion of science and engineering concepts seemed aimed at an upper elementary or middle school level. From memory, the author went much deeper into these concepts in the interview I heard. It’s not a bad book, just not what I was looking for. If you’re just starting to learn about engineering (and really like “dad joke” level puns), this book could be for you. Otherwise, there are better options available.

Highlights:

  • It’s difficult to imagine now, but nails were so valued in this pre-industrial era, where materials and skilled workers weren’t readily available, that the British banned their export to their colonies, including North America, where timber housing was the norm. As a result, nails became so precious there that some people even set fire to their homes when moving in order to retrieve the nails from the ashes. In 1619, a law was passed in the state of Virginia to discourage this practice
  • it was decided to verify whether flush riveting genuinely made a difference to the Spitfire’s speed. The methods they employed to test this were unusual. Engineers glued a split pea to the head of every flush rivet on the plane (making it look, according to one source, like it had a ‘chickenpox infection’), then flew it and noted the speed. More test flights followed, in which the split peas were removed in stages and the results noted. This ultimately vindicated Mitchell’s choice of flat heads: data showed that domed rivets would reduce the top speed of the fighter plane by up to 35km per hour.
  • we were making jewellery, wine, boats, and musical instruments (which are all pretty impressive feats of engineering) long before we thought up the wheel.
  • The problem lay in plotting the Earth’s longitudes, a vital means of orientation when approaching land. (Polynesian navigators had been calculating longitude for years through natural observation and knew their patch of ocean well, but these techniques weren’t used in the West.)
  • designing EMUs for the Apollo Lunar mission – but they had other problems. The suits being designed at the time were stiff and bulky, and severely restricted the astronauts’ movements. In 1967, the industrial division of Playtex – a company that specialised in making girdles and bras – used their experience to create a spacesuit made almost entirely from fabric. They put one of their employees in their prototype and then filmed him running and kicking and throwing a football in the field of a local high school. They won the contract, giving their bra-sewing seamstresses a new project to work on.
  • Engineering tells an intrinsically human story.
  • we only ‘own’ an object for a small proportion of its life, and that having a deeper understanding of design will reveal the massive repercussions, the long chain of events that affect our planet every time we produce or consume something,

Blood in the Machine

Title: Blood in the Machine: The Origins of the Rebellion Against Big Tech

Author: Brian Merchant

Completed: Dec 2023 (Full list of books)

Overview: The term “Luddite” gets thrown around a lot these days. I had a general idea that it came from people who broke some of the first automated machines about 200 years ago, but they were always portrayed as trying to hold back an inevitable march of technological innovation. This book clarified the history while providing many more details about who they were and why they were fighting. By the end, I had come to accept that despite my love of coding and robotics, I too would likely have been a Luddite. What about you, are you a Luddite?

Highlights:

  • in the 1800s, automation was not seen as inevitable, or even morally ambiguous. Working people felt it was wrong to use machines to “take another man’s bread,” and so thousands rose up in a forceful, decentralized resistance to smash them. The public cheered these rebels, and for a time they were bigger than Robin Hood, and more powerful.
  • workers and artisans resisted conditions they found unfavorable by breaking the machines used to exploit them. If merchants or shop owners refused to pay established rates or tried to bypass legal regulations with new technology, workers might smash the machinery they deemed “obnoxious.” At a time when organizing unions was illegal, it was a strategy embraced by workmen whose jobs were on the line with no other recourse: “collective bargaining by riot,” as one historian termed it. “The eighteenth-century master was constantly aware that an intolerable demand would produce, not a temporary loss of profits, but the destruction” of his machinery.
  • workers did not view technology as inherently progressive; they had not been taught to lionize disruption. To them, devices that would degrade their working conditions, or harm their ability to earn a living, were a moral violation, plain and simple.
  • This was the original “cottage industry,” and it made for a flexible and family-oriented lifestyle. There was demanding work to be done, and everyone was expected to pitch in, but it was common to work just thirty hours a week, on one’s own schedule, and take long weekends.
  • He gave up the case, but the point was made: the knitters and the hosiers profiting from their knitting were subject to very different rules. From there, the bitterness between the workers and the hosiers deepened.
  • “If workmen disliked certain machines, it was because of the use that they were being put, not because they were machines or because they were new,”
  • that it was unethical to put men out of work for the sake of directing profit into the hands of a few when times were as bad as they were for so many.
  • “Managers feel they must automate because ‘everyone’s doing it,’ out of fear that they will be undone by more up-to-date competitors (a paranoia encouraged by equipment vendors). There is this vague belief that the drive to automate is inevitable, unavoidable, and this belief becomes a self-fulfilling prophecy.”
  • After all, the machine breakers were not ultimately after the machines themselves but rather the men who were using them to transform social relations and gain power. The Luddites were technologists themselves; they did not hate the machines, though they did not hold any undue respect for them, either.
  • The croppers were not yet starving but saw the injustice making its way to their doorstep. There was no reason for anyone to starve; the factory owners were still turning profits. If they stopped running new machines, there would be more work to go around. Besides, to the artisans who had taken seriously the traditions and contracts of their trade, using machinery to take market share at the cost of the worker was neither moral nor fair. It infuriated them that the entrepreneurs and factory owners claimed the right to automate people’s work, when it so obviously led to suffering.
  • But why “Ludd”? There was the legend of Ned Ludd, sure; but that was just that, a legend. There could have been a real Ned Ludd in Leicester (or elsewhere) who smashed his master’s frame and caused enough of a stir that insurgent workmen might adopt his name as their own. Or, the legend could have been retrofitted to suit the machine breakers’ cause. Lludd Llaw Eraint, for instance, was a Welsh hero who lost an arm in battle and was exiled as god-king, only to receive a new silver replacement that allowed him to return as a sort of cyborg warrior deity; Lludd of the Silver Hand would evolve when adopted by British folklore,
  • The entrepreneurs’ faith in “progress” was rooted in the trendy philosophy of the Scottish economist Adam Smith. “Laissez-faire,” the Luddite historian Brian Bailey wrote, had become “the political dogma of the English bourgeoisie. In fact, it represented freedom for the employers and intolerable repression of the workers.”
  • The state had placed a bounty on the Luddites’ heads, if anyone dared take it up. As soon as a copy of the proclamation was nailed to the church door in Sheepshead, someone posted a proclamation of their own alongside it: As the government had offered a reward of £50 for the conviction of offenders, there were 50 bullets ready for the body of the first man who should give information. It was signed, “Ned Lud.”
  • Many of the masters, after all, were men that George and Ben would have known much of their lives. Men that knew their families, their history, their community. These social bonds were not easily broken; it was not easy to stare into the eyes of a friend and say, I am taking your job. (One benefit of machinery was that it could be used as a rhetorical tool as well, to muddy the moral clarity of the situation—a use it’s been put to by owners ever since. It’s the robots, not your boss, that’s coming to take away your job.)
  • until the nineteenth century, entrepreneurship was not a cultural phenomenon. Businessmen took risks, of course, and undertook novel efforts to increase their profits. Yet there was not a popular conception of the heroic entrepreneur, of the adventuring businessman, until after the birth of industrial capitalism.
  • We can look back at the Industrial Revolution and lament the working conditions, but popular culture still lionizes entrepreneurs cut in the mold of Arkwright, who made a choice to employ thousands of child laborers and to institute a dehumanizing system of factory work to increase revenue and lower costs. We have acclimated to the idea that such exploitation was somehow inevitable, even natural, while casting aspersions on movements like the Luddites as being technophobic for trying to stop it. We forget that working people vehemently opposed such exploitation from the beginning.
  • The word innovation, it’s worth noting, carried negative connotations until the mid-twentieth century or so; Edmund Burke famously called the French Revolution “a revolt of innovation.”
  • influential economist John Maynard Keynes predicted in 1930 that improvements in machinery and, subsequently, productivity would lead us to a fifteen-hour workweek at most. If automation could be harnessed for the “common benefit,” as Booth argues, that might be a plausible outcome. Instead, it has consistently played out as Mellor has feared; labor-saving technology has accelerated the accumulation of capital among an ever-shrinking pool of elites.
  • the Luddite movement was not about technology; it was about workers’ rights. Luddism started as a tactical strike against the technologies of control, but had exploded into a greater expression of the rage against a system where the privileged few with access to the right levers could lift themselves up at the expense of the many.
  • Technological disruption is not an accidental or inevitable phenomenon, either, but an intentional one. Two hundred years ago, like today, aspiring entrepreneurs and nascent tech titans saw an opportunity to deploy technology to do work, more cheaply, more efficiently, and at greater scale than it had been previously done by skilled workers. They saw an opportunity for disruption, and disruption was the point. They knew that their machines would upend communities and traditions, but also make them money. Sometimes they knew they’d be trampling regulations, but reasoned that such laws were old and outmoded, and that they could justify it later. Motivated by competition with their peers and the promise of profits,
  • The cloth workers of England at the outset of the Industrial Revolution had every reason to be angry; they were not “unthinking” in their opposition to machinery. They even proposed plans to help cushion the introduction of automation in a way that would be more stable for workers and employers alike. And they were shut out of the process altogether, often ignored or derided, and ultimately left to starve.
  • The cloth workers were not only proactive, legally minded, and dogged in seeking their fair shake. They were creative, too. They recognized technology was improving—cloth workers themselves were often the ones that improved it—and were on the lookout for ideas as to how machines might be more harmoniously introduced into workplaces to benefit them all. Take, for instance, this idea for blunting the pain of automation by taxing technology: “Proposals were in the air for gradual introduction of the machinery, with alternative employment found for displaced men, or for a tax of 6d. [sixpence] per yard upon cloth dressed by machinery, to be used as fund for the unemployed seeking work.” They suggested placement and retraining programs. They also proposed phase-in periods, or waiting for economic conditions to improve, so that automated machinery could be introduced less disruptively. In fact, these 1800s cloth workers put forward just about every idea that’s gained prominence in the twenty-first century to blunt the pain of automation.
  • These arguments parallel the ones advanced by Uber and other major gig economy companies like Instacart and Doordash throughout the 2010s and 2020s. Uber’s chief innovation is not that its app summons a car to your location with a smartphone and a GPS signal. It is that it used this moderately novel configuration of technology to argue that the old rules did not apply whenever it brought its taxi business to a market that already had a regulated taxi code.
  • the owner of a Manchester cotton factory received a letter signed by an Eliza Ludd. It’s one of the most eloquent of the Luddite missives, comparing the ongoing uprising to the American Revolutionary War.
  • There are few if any known instances of women participating in frame-breaking outright, but “during the troubles of 1812 women continued to be very prominent,” the targets were still factories stocked with the automating machinery, the homes built with profits made from running them, or places that stored or distributed food. Even at its most violent, the Luddite uprising had kept its focus fixed on the implements of inequality, or on a means of evening the scales.
  • As easy as it is to forget the technologies that have been rejected—whether automated cloth-weaving devices, nuclear power plants, or contemporary facial-recognition tech—“No” is, and has always been, an option—whether by policy or by force.
  • Whitney even suggested that his device could help end slavery, since laborers would no longer have to do the unpleasant work of picking the seeds out by hand. That is not what happened. Instead, the cotton gin is one of the original sins of automated technology, and the most disastrous case of unintended consequences unleashed upon the world this side of the nuclear bomb. Whitney’s machine was widely pirated, modded, and adopted by plantation owners, who saw little need to compensate the inventor. The cotton gin worked so well that it wildly increased the demand for workers to do every other part of the cotton production process, especially the hoeing and the picking. Slavery, an institution whose future was at the time in question—Northerners wanted it abolished, and were drawing close to legislating restrictions—received a lifeline, then an economic raison d’être. The export of cotton became the biggest industry in the United States, so economically powerful, generating so much wealth for plantation owners, that it helped sustain the institution of slavery for another seventy years.
  • The words South Carolina had long been a bugaboo for Charles and his friends and family back in Maryland. It was shorthand for a special kind of hell, even in the hellish context of slavery in general.
  • On May 11, an unremarkable-looking man named John Bellingham sat on a bench in the lobby of the House of Commons, watching the statesmen enter and exit the building. When he finally spotted the prime minister, he stood up, walked over to him, and shot him through the heart at point-blank range. “I am murdered!” Perceval shouted. Bellingham returned calmly to the bench, behind the fallen minister, and placed the literal smoking gun on his lap. He was apprehended without a struggle and taken into custody. Perceval died within minutes. He was the first and, to this day, only prime minister of England to be assassinated on the job. Bellingham later said he thought that Britain would cheer the murder as an act of justice, and he was not entirely wrong—by the time he was led away from the crime scene, a crowd had gathered and many called out their support.
  • the logic of unfettered capitalism ensures that any labor-saving, cost-reducing, or control-enabling device will eventually be put to use, regardless of the composition of the societies those technologies will disrupt. Consider it the iron law of profit-seeking automation: once an alluring way to eliminate costs with a machine or program emerges, it will be deployed.
  • public support seemed to be on the wane after the murder of Horsfall. That is what it took to safeguard the emerging factory system, and the normalization of automation—and
  • Thompson erected the straw man that endures today, that the Luddites were too dumb to see that automation was for everyone’s benefit in the long run. But to argue that a weaver is delusional for recognizing that a machine that destroys his job is “inimical” to his interests seems the eclipsing delusion. If a person must work to survive, and their job becomes automated, you would have to be either deluded or willfully disingenuous to be surprised when they fight to keep it.
  • The caricature of Luddites as chiefly technophobic, born in the minds of entrepreneurs and elites, was elevated to prominence in this courtroom. It has endured for centuries.
  • It’s easy to see the Luddites as a driving inspiration, and scholars of the period have argued that Dr. Frankenstein’s monster is a symbolic stand-in for the machine breakers. Mary was less of an outspoken advocate for the Luddites or working-class movements than her husband was. She was liberal in her politics, but, like Byron, the prospect of a bloody revolution made her “shudder.” Even so, Frankenstein was clearly an allegorical work, composed against the backdrop of an uprising in the waning years of the Enlightenment, to the soundtrack of the machinery question that the Luddites beat onto the national stage. The mad doctor may as well be an entrepreneur who uses cutting-edge technology to force someone into a particular way of life—an automated factory, say—and then is surprised when that individual grows angry at his barren, rudderless existence.
  • “England’s loss was our gain,” John Baker, the former head of one of Australia’s largest telecom unions, said in the 1970s. Ever since the Luddites were “transported” to Australia, he explained, they’ve had an outsized influence on shaping attitudes toward work, and the importance of strong unions. Australia led the world in fighting for eight-hour workdays, mass unionization, and social democracy.
  • Students for a Democratic Society leader Mario Savio famously encapsulated the Luddite ethos. “There is a time when the operation of the machine becomes so odious, makes you so sick at heart, that you can’t take part!” he yelled out from the steps of Sproul Hall at UC Berkeley. “You can’t even passively take part! And you’ve got to put your bodies upon the gears and upon the wheels,… upon the levers, upon all the apparatus, and you’ve got to make it stop! And you’ve got to indicate to the people who run it, to the people who own it, that unless you’re free, the machine will be prevented from working at all.”
  • “The separation of workplace and home—of working hours and free time—remained the exception for most of human history, only becoming widespread during the Industrial Revolution through the centralization of gainful employment in the factories and offices of the industrialized West at the end of the eighteenth century,” according to Andrea Komlosy, the historian and author of Work: The Last 1,000 Years.
  • “Algorithmic hiring and firing have become an everyday part of people’s lives, whether we’re talking about customer service agents or warehouse workers.” The aim, she says, is to create an anxious, uncertain workforce that has no choice but to be malleable before the algorithm’s demands.
  • “The Luddites were actually protesting the social costs of technological ‘progress’ that the working class was being forced to accept,” the sociologist Ruha Benjamin wrote in Race after Technology.
  • “To break the machine was in a sense to break the conversion of oneself into a machine for the accumulating wealth of another,” notes cultural theorist Imani Perry.
  • workers might remember how close the Luddites came to repelling the technologies of their oppression. They might remember what worked—tight-knit solidarity, distributed organizing models, shows of power and creativity capable of inspiring influential producers of culture, unrepentantly aggressive actions against those oppressive technologies specifically; and what did not—violence against individuals, a lack of coordination with an empowered political body, the absence of a sustained effort to grow one.
  • If the Luddites have taught us anything, it’s that robots aren’t taking our jobs. Our bosses are. Robots are not sentient—they do not have the capacity to be coming for or stealing or killing or threatening to take away our jobs. Management does.
  • We will be able to make better decisions about automation, however, if we understand that, in practice, “the robots are coming for our jobs” usually means something more like “a CEO wants to cut his operating budget by 15 percent and was just pitched on enterprise software that promises to do all the work currently done by thirty employees in accounts payable.”

The Art of Relevance

Title: The Art of Relevance

Author: Nina Simon

Completed: Oct 2023 (Full list of books)

Overview: I stumbled onto this book when I saw the author promoting her new book on LinkedIn. We met at the ASTC annual conference over a decade ago, and I wrote part of an article she got published in an industry magazine, but we hadn’t spoken in years, which is how I missed this book when it came out. It is an interesting look at the importance of relevance within the museum/science center world, told mostly through a series of stories about different projects at cultural institutions. I was reminded how energizing it can be to organize a great event or exhibition at a science center. As I’ve moved into formal education, I miss that. At schools, we recognize the value of relevance, but achieving it within the confines of a static curriculum is often challenging. I feel fortunate that Career and Tech Ed (CTE) classes have more flexibility than most, and I’m looking for ways to continue incorporating these ideas into classroom experiences.

Highlights:

  • we sought, little by little, to understand what mattered to people in our community. To understand how we could replace our locked doors with ones that opened widely to our community and the cultural experiences they sought.
  • Relevance is a paradox. It is essential; it gets people to pay attention, to walk in the door, to open their hearts. But it is also meaningless without powerful programming on the other side of the door. If the door doesn’t lead to valuable offerings, if nothing touches people’s hearts, interest fades.
  • Relevance is only valuable if it opens a door to something valuable. Once I understood the depth of Princes of Surf, I got embarrassed thinking about all the other projects I thought were relevant, doorways I had built for rooms that were hardly more than stage sets. Too often, our work opens doors to shallow, interchangeable rooms. We adorn the entrances with phrases like FUN! or FOR YOU!, but that doesn’t change what’s behind the doors. We lie to ourselves, writing shiny press releases for second-class objects and secondhand stories. The rechewed meat of culture. We tell ourselves that as long as we link our work to people’s interests on the surface, they’ll be rushing for our door. And they may come in the door… but they won’t come back. Doors to dullness are quickly forgotten.
  • there are two criteria that make information relevant: 1. How likely that new information is to stimulate a positive cognitive effect—to yield new conclusions that matter to you. 2. How much effort is required to obtain and absorb that new information. The lower the effort, the higher the relevance.
  • Too often, we expect people to do the work of manufacturing relevance on their own. They won’t. It’s too much work. Our brains crave efficiency. If it takes too many leaps to get from here to there, relevance goes down. The line need not be straight, but it must be clear, and short.
  • Irrelevance can be damaging, especially for organizations with limited resources to attract and engage people. Irrelevance is just as appealing to those of us doing the work as it is to those we seek to reach. Irrelevance is everywhere. It is in every sexy new technology. Every program pursued strictly to fulfill a funder’s interest. Every short-sighted way that we get people’s attention without capturing their imagination.
  • There are two kinds of people in the world of relevance: outsiders and insiders. Insiders are in the room. They know it, love it, protect it. Outsiders don’t know your doors exist. They are uninterested, unsure, unwelcome. If you want new people to come inside, you need to open new doors—doors that speak to outsiders—and welcome them in.
  • To be relevant, we need to cultivate open-hearted insiders. Insiders who are thrilled to welcome in new people. Who are delighted by new experiences. The greatest gift that insiders can give outsiders is to help them build new doors. To say, I want you here—not on my terms, but on yours. I’m excited you think there might be something of value in this room. Let me help you access it.
  • In my experience, the institutionally-articulated “needs” of audiences often look suspiciously like the “wants” of the professionals speaking. Professionals want silence in the auditorium, so they say “people need respite from their busy lives.” Priests want parishioners to accept the canon as presented, so they say “people need strong spiritual guidance.” Teachers want students to listen attentively, so they say “kids need to learn this.”
  • When I ask what the phrase “don’t give people what they want, give them what they need” means, I am often told that we should not be pandering to people’s expressed desires but presenting them with experiences that challenge them and open up new ways of seeing the world. I agree. It is incredibly valuable for cultural institutions to present experiences that might be surprising, unexpected, or outside participants’ comfort zones. But I don’t typically hear this phrase deployed to argue in favor of a risky program format or an unusual piece of content. I don’t hear this phrase accompanied by evidence-based articulation of “needs” of audiences. Instead, I hear this phrase used to defend traditional formats and content in the face of change. I hear “don’t give people what they want, give them what I want.”
  • Recent research in many fields, including education, public health, and public safety, shows that we can be more effective when we focus on assets as well as needs. In asset-based programs, the institution focuses on cultivating and building on people’s strengths instead of filling needs or fixing weaknesses. Instead of penalizing young bullies, asset-based crime prevention programs help assertive children take on leadership roles. Instead of lecturing families about the food pyramid, asset-based nutrition programs encourage families to share their own favorite recipes.
  • we’ve gravitated towards a “community first” program planning model. It’s pretty simple. Instead of designing programming and then seeking out audiences for it, we identify communities and then develop or co-create programs that are relevant to their assets, needs, and values. Here’s how we do it: 1. Define the community or communities to whom you wish to be relevant. The more specific the definition, the better. 2. Find representatives of this community—staff, volunteers, visitors, trusted partners—and learn more about their experiences. If you don’t know many people in this community, this is a red flag moment. Don’t assume that programs that are relevant to you or your existing audiences will be relevant to people from other backgrounds. 3. Spend more time in the community to whom you wish to be relevant. Explore their events. Meet their leaders. Get to know their dreams, points of pride, and fears. Share yours, too. 4. Develop collaborations and programs, keeping in mind what you have learned.
  • individuals learning about the people who matter most in their lives and then sculpting new doors for them. Any time we personalize something for someone—based on what they want to receive, not what we want to offer—this happens.
  • You can elicit someone’s entrance narrative anytime they walk through your doors. This is a simple two-step process. First, find a way to ask the person what brought them in. Then, find a way to affirm and build on their response. You might provide a special recommendation for something to see or do based on their interests. You might seat them in a particular area, help them take a group photo, or invite them to another event.
  • These new programs fundamentally altered their institution’s offerings. When communities of interest avoid your programming regardless of your marketing investments, you need to change the room. If people attend once and don’t come back, it’s probably a problem with the experience and not the marketing.
  • Every display and artifact had been reconfigured. But, he explained to us, we were not there to cut the ribbon and marvel at the finished product. We were there to critique the installation and to kick off its next transformation. The director said to us: “We are proud of the new installation that we share with you today. But we also know that this is day one of it becoming outdated.”
  • The urge to entertain can be a serious distraction from relevance—the kind of irrelevance that makes your work harder to access, not easier. Relevance doesn’t trump compelling—it does something different. The function of relevance is to create a connection between a person and a thing, in a way that might unlock meaning for that person. If you can tell a relevant story first, you are more likely to create an appetite for other compelling information you have to share.
  • Some institutions get caught up in chasing trends, arguing: “if people on the street are talking about X, we should be talking about X too.” No. If people on the street are talking about X, the organization should ask: is X something that matters to us, too? Does X belong in our room? And if so, how do we want to address X through the lens of our mission, content, and form?
  • at The Tech Museum of Innovation in San Jose, CA. Our mandate was to be the museum of Silicon Valley—not of its material history, but of its pulse of innovation. It was impossible. The exhibits we built were immediately dated. Their physicality, long development timelines, and big budgets dragged them down. They didn’t dance to the thrilling drumbeat of change at the heart of Silicon Valley. They were immutable objects plunked on the sidelines.
  • Public advocacy work is good for business as well as for mission: a 2015 IMPACTS study of 48 leading US cultural institutions showed a 98% correlation between visitor perception of “delivering on mission” and financial metrics of success like fundraising ability and financial stability.
  • The stronger your core, the more you can reach out with confidence. The more doors you open, the more relevant you will be.

Religious Literacy

Title: Religious Literacy: What Every American Needs to Know–And Doesn’t

Author: Stephen Prothero

Completed: September 2023 (Full list of books)

Overview: This is one of the only books I’ve reread, and now that I have, I realized it wasn’t really the book I wanted to read either time. I first read it around 2010 in hopes of learning more about different world religions. I enjoyed it but didn’t retain most of what was in it. Reading it this time, I was reminded that it is a wonderful overview of religion in America from pre-revolution through the mid-20th century. That is certainly interesting, but both times I was looking for a better understanding of world religions. Luckily, he had a recommendation for that, The World’s Religions by Huston Smith, so I’ve added it to my list. I found many of the stories in this book interesting, but I hope I look back at these notes before considering reading it a third time.

Highlights:

  • In Cultural Literacy, Hirsch, a University of Virginia English professor, argued that much of our common cultural coin had been drastically devalued. (“Remember the Alamo”? Um, not really.) Hirsch traced this problem to John Dewey and other Progressive-era education reformers, who gave up in the early twentieth century on content-based learning in favor of a skills-based strategy that scorned “the piling up of information.” This new educational model produced, according to Hirsch, “a gradual disintegration of cultural memory,” which caused in turn “a gradual decline in our ability to communicate.” Hirsch rightly understood that there are civic implications of this descent into cultural ignorance, particularly in a democracy that assumes an informed citizenry.
  • Today far too many thinkers, on both the left and the right, cling to the illusion that we live in a “post-Christian” country and a secular world. But evidence of the public power of religion is overwhelming, particularly in the United States. As Boston University law professor Jay Wexler has observed, “A great many Americans rely on religious reasons when thinking and talking about public issues. Ninety percent of the members of Congress, by one report, consult their religious beliefs when voting on legislation. A majority of Americans believe that religious organizations should publicly express their views on political issues, and an even stronger majority believe it is important for a President to have strong religious beliefs.”
  • Evangelical pollsters have lamented for some time the disparity between Americans’ veneration of the Bible and their understanding of it, painting a picture of a nation that believes God has spoken in scripture but can’t be bothered to listen to what God has to say. The Democratic presidential aspirant Howard Dean, when asked to name his favorite New Testament book, mistakenly cited an Old Testament text (Job) instead. But such confusion is not restricted to Dean’s home state of Vermont. According to recent polls, most American adults cannot name one of the four Gospels, and many high school seniors think that Sodom and Gomorrah were husband and wife.
  • When the Seneca Falls convention of 1848 put female suffrage on the national agenda, most citizens knew that suffragettes would have to contend with the injunctions in 1 Timothy and 1 Corinthians (two New Testament letters attributed to the apostle Paul) that women should keep silent in the churches and submit to male authority. Today it is a rare American who can follow with any degree of confidence biblically inflected debates about abortion or gay marriage. Or, for that matter, about the economy, since the most widely quoted Bible verse in the United States—“God helps those who help themselves”—is not actually in the Bible.
  • The United States is by law a secular country. God is not mentioned in the Constitution, and the First Amendment’s establishment clause forbids the state from getting into the church business. However, that same amendment also includes a free exercise clause safeguarding religious liberty, and Americans have long exercised this liberty by praying to God, donating to religious congregations, and hoping for heaven. So there is logic not only to President John Adams’s affirmation in the Treaty of Tripoli in 1796 that “the government of the United States of America is not in any sense founded on the Christian religion” but also to the Supreme Court’s 1892 observation that “this is a Christian nation.”
  • Some surveys show that the portion of “Nones” (those who claim no religious preference) is rising in the United States—doubling by one account over the course of the 1990s from 7 percent to 14 percent. But those who have distanced themselves from organized religion have done nothing of the sort when it comes to God or spirituality. In a recent survey of US adolescents, sociologist Christian Smith found that, of the teenagers who claimed “no religion,” fewer than one out of five rejected the possibility of life after death. In a recent study of American adults, nine out of ten of the “no religion” respondents told researchers that they pray. These “Nones,” in short, are about as irreligious as your average nun. Few are Euro-style atheists or agnostics; the vast majority are “unchurched believers”—spiritual people who for one reason or another avoid religious congregations.
  • Of America’s religions, the most popular of course is Christianity. Half of Americans describe themselves as Protestants, one-quarter as Catholics, and 10 percent as Christians of some other stripe. This makes the US population more Christian than Israel is Jewish or Utah is Mormon.
  • The Gospel of John instructs Christians to “search the scriptures” (John 5:39), but little searching, and even less finding, is being done.
  • When religion is mentioned in US history schoolbooks, it is all too often an afterthought or an embarrassment (or both) and clearly a diversion from what is presumed throughout to be a secular story. Historian Jon Butler has called this the jack-in-the-box approach: Religious characters pop up here and there, typically with all of the color and substance of a circus clown, but their appearances—prosecuting witches in Salem in the 1690s or making monkeys of themselves at the Scopes Trial in Dayton, Tennessee, in the 1920s—are always a surprise (or a scare), and, happily, they go back into hiding as quickly as they emerge. Readers of American history textbooks might learn something about the religious bigotry of the Puritans and the quaint customs of Native Americans of bygone days.
  • none of the classic events in American history—the Revolution, the Civil War, the New Deal, the Reagan Revolution—can be understood without some knowledge of the religious motivations of the generals, soldiers, thinkers, politicians, and voters who made them happen.
  • When the war ended, both sides saw it as an Armageddon of sorts. Southerners fastened onto the Myth of the Lost Cause, which embraced Confederate soldiers as martyrs and the South as something of a resurrected Christ, while Northerners anointed Lincoln, who was assassinated on Good Friday, as a Christ of their own who shed his blood to atone for the sins of the nation.
  • Progressive proponents of the Social Gospel, by contrast, saw capitalism as a sin. The novel In His Steps (1897) by the Congregationalist minister Charles M. Sheldon is remembered today for bequeathing to us the query “What would Jesus do?” but its original purpose was to drive home the point that if Jesus were out and about in Victorian America he would be caring for slum dwellers, not selling steel.
  • Partisans of “muscular Christianity,” recalling that Jesus “came not to send peace, but a sword” (Matthew 10:34), contended that their “manly Redeemer” would want them to fight for what is right. Christian pacifists, who worshipped a “sweet Savior,” countered with the story of Jesus rebuking followers after they drew blood from his captors in the Garden of Gethsemane (Matthew 26:51–52).
  • also affects Indian tourism (since some high-caste Hindus consider traveling outside of India polluting), AIDS in Africa (where the Roman Catholic Church forbids artificial birth control), and banking throughout the Muslim world (since Islamic law prohibits the giving and receiving of interest).
  • As a series of recent Supreme Court rulings has made plain, the First Amendment requires that the public schools be neutral with respect to religion. That means not taking sides among the religions, not favoring Christianity over Buddhism, for example, or the Baptists over the Lutherans. But it also means not taking sides between religion and irreligion. As Justice Tom Clark wrote in Abington v. Schempp (1963), public schools may not preach the “religion of secularism.”
  • As Nord noted, “For some time now, people have rightly argued that ignoring black history and women’s literature (as texts and curricula have traditionally done) has been anything but neutral. Rather, it betrays a prejudice; it is discriminatory. And so it is with religion.”
  • the Court has repeatedly and explicitly given a constitutional seal of approval to teaching about religion “when presented objectively as part of a secular program of education.”
  • With the American Revolution came a new rationale for basic literacy, and a new aim. Whereas the revolution of Luther and Calvin had provided a theological justification for reading, the revolution of Washington and Adams provided a civic one. Now children needed to read not only to be good Protestants but also to be good citizens—to free themselves from the tyranny of popes as well as kings. The theory here was simple, and it was rooted in a shared sense of the fragility of democratic government. Unlike European monarchies, which saw educated citizens as a bother at best, the American experiment in republican government, which vested sovereignty in the people and, by the 1820s, extended suffrage without regard to economic means (though, it must be noted, still in regard to race and sex), depended for its survival on an informed citizenry. Or, as James Madison put it, “A people who mean to be their own Governors, must arm themselves with the power which knowledge gives.” And so two potent justifications for literacy developed side by side. Children would learn to read both to free themselves from sin and to liberate themselves from monarchs—both to save their souls and to save the republic.
  • One of the myths of American education is that once upon a time (that is, before the Religious Right started to muck around in the public schools) public education was secular. This is simply not so. From their early-nineteenth-century beginnings, common schools were very much a part of an unofficial yet powerful Protestant establishment, which included the leading Protestant denominations and a “Benevolent Empire” of nondenominational voluntary associations dedicated to improving the world through peace, temperance, abolitionism, and other social reforms.
  • This famine was particularly worrying in light of the feast of secular novels and other “vicious literature” available on the frontier. Pioneers seemed to expend the limited reading skills they possessed on literature that amused rather than edified. As a result, the masses on the frontier were left “in the grossest darkness and spiritual ignorance,” “destitute”
  • historians Jon Roberts and James Turner have observed, state institutions were if anything more explicitly theological than their private counterparts “since they answered to electorates deeply suspicious even of Catholics, much more of outright unbelievers.” As late as 1905 a study of religion at state universities would conclude that these institutions were “more intensely and genuinely Christian than the average community.”
  • revivalism made Christians. In fact, it made converts by the millions. Church membership rates more than doubled from roughly 17 percent of Americans at the start of the Revolution to 34 percent in 1850.
  • The key figure behind the nondenominational or nonsectarian solution to the problem of religion in public education was Horace Mann, the education reformer (and, not coincidentally, Unitarian) who served as the secretary of the Massachusetts Board of Education from its founding in 1837 until 1848. More than anyone else, Mann determined the role religion would play in the nation’s public schools.
  • the focus of education shifted from teaching religious doctrines to inculcating moral character. The great exodus of religion from the minds of American citizens was under way. The effects of this exodus remain with us today, notably in our collapsing of religion into “values” and “values” into sexual morality, which in turn functions via an odd sort of circular reasoning as a proxy for religiosity. At least in popular parlance, what makes religious folks religious today is not so much that they believe in Jesus’ divinity or Buddhism’s Four Noble Truths but that they hold certain moral positions on bedroom issues such as premarital sex, homosexuality, and abortion.
  • “Religion prospered while theology slowly went bankrupt.” Once upon a time, the sermon had educated parishioners about such Christian staples as the Trinity and the Ten Commandments, and the stories ministers told from the pulpit were restricted to the grand biblical narratives of Moses, Abraham, Sarah, Jesus, and Mary. Over the course of the nineteenth century, however, the sermon descended, as Hofstadter put it, “from the vernacular to the vulgar”; the pew became a place where you could hear the likes of Moody fuming that “an educated rascal is the meanest kind of rascal”
  • What made you a Christian, both conservatives and liberal Protestants argued, was not affirming a particular catechism or knowing certain Bible stories; rather, what made you a Christian was having a relationship with an astonishingly malleable Jesus—an American Jesus buffeted here and there by the shifting winds of the nation’s social and cultural preoccupations.
  • 4 Gospels. The four narratives of the life of Jesus included in the New Testament of the Christian churches. They are: Matthew, Mark, Luke, and John.
  • 5 Ks. Symbols that identify male members of a Sikh order called the Khalsa, so called because each begins in Punjabi with the letter k. They are: kes, uncut hair; kangha, comb; kirpan, ceremonial sword; kara, steel wrist bangle; kachh, short pants.
  • 5 Pillars of Islam. The key practices of Islam, obligatory for all Muslims. They are: Shahadah, or witnessing that “There is no god but God, and Muhammad is the messenger of God”; salat, or prayer in the direction of Mecca five times a day (dawn, noon, afternoon, sunset, and evening); sawm, or fasting (from sunrise to sunset) during the lunar month of Ramadan; zakat, or almsgiving to the poor (via an asset tax); hajj, or pilgrimage to Mecca, once in a lifetime for all who are physically and financially able.
  • ahimsa. Term in Hinduism, Buddhism, and especially Jainism, often translated as nonviolence, referring to not harming or wishing to harm. Described by Jains as the highest moral duty,
  • Jesus repeatedly told his followers that he had come not to strengthen families but to set family members against one another: “If any man come to me, and hate not his father, and mother, and wife, and children, and brethren, and sisters, yea, and his own life also,” he said, “he cannot be my disciple” (Luke 14:26).
  • just-war theory. Catholic tradition, dating to Thomas Aquinas, describing both what makes a war just (jus ad bellum) and what conduct is justifiable during such a war (jus in bello). Concerning how to conduct a war, just war theorists often cite such principles as “discrimination” (which says that combatants should direct their aggression against other combatants rather than innocent civilians) and “proportionality” (which says that force cannot be out of proportion to the injury suffered). Just war theory also prohibits torture and mandates proper care for prisoners of war.
  • Mormons recognize four scriptures: the Bible (“as far as it is translated correctly”), the Book of Mormon, Pearl of Great Price, and Doctrine and Covenants.
  • The best way for newcomers to read the Quran is not from front to back but from back to front. Start with the Fatihah, but then skip to the shorter, more theological suras in the back. Then read the narratives of the prophets (toward the middle) before concluding with the legalistic content of the long suras at the front.
  • Like Muslims, Sikhs are strict monotheists who emphasize divine sovereignty. They reject the view that God incarnates in human form, believing instead in a formless God that can be known through singing and meditation. Sikhs too have a sacred center, in this case the Golden Temple of Amritsar, India. Like Hindus, Sikhs believe in karma and reincarnation.

A (Very) Short History of Life on Earth

Title: A (Very) Short History of Life on Earth: 4.6 Billion Years in 12 Pithy Chapters

Author: Henry Gee

Completed: August 2023 (Full list of books)

Overview: This was an unusual book to me. It started off covering vast swaths of early prehistory at breakneck speed, often discussing a species only long enough to note its name, what it evolved from, and what came after it. It was reminiscent of sections of Genesis (Adam begat Seth; Seth begat Enos; etc.). At this stage, I almost stopped reading, which I rarely do. I’m glad I opted to continue. Several of the later chapters were much more interesting to me. The last chapter also looked ahead in a way that was both reassuring and alarming at once. He argues that we need to continue to do all we can to fight climate change and reduce our impact on the environment, but that no matter what we do, humans will be extinct within a few thousand years. After that, the Earth will quickly (geologically speaking) wipe away any trace that we were ever here. The planet doesn’t need “saving”; it will be fine. Life will continue. There will be no people, and few of the species we currently know, but that’s the way life has been for billions of years, constantly changing.

Highlights:

  • Stromatolites—as we have seen, the first visible signs of life on Earth—were colonies of different kinds of bacteria. Bacteria can even swap portions of their own genes with one another. It is this easy trade that means, today, that bacteria can evolve resistance to antibiotics. If a bacterium doesn’t have a resistance gene for a particular antibiotic, it can pick it up from the genetic free-for-all of other species with which it shares its environment.
  • Sponges have no distinct organs or tissues. A live sponge pushed through a sieve and back into the water will pull itself together into a different shape but one just as alive, just as functional. It is a simple life that requires little energy—and little oxygen.
  • Some of the armored heirs of Saccorhytus created their own distinctive suits of chain mail, each link sculpted from a single crystal of calcite. In doing so they became the echinoderms—the spiny-skinned ones—the ancestors of the starfishes and sea urchins of today. All modern echinoderms have a distinctive body shape based on the number five, entirely different from any other animal. In the Cambrian, however, their shapes were more varied. Although some were bilaterally symmetrical, a few were triradial (that is, with a symmetry based on the number three), and yet others were completely irregular.
  • The evolution of the seed, like the evolution of the amniote egg, allowed plants to break away from the tyranny of water.
  • As pterosaurs evolved, they grew, until the last of their kind—at the end of the Cretaceous period—were as large as small airplanes and barely flapped at all. Light in build but with enormous wings, all they needed to do to take off was to spread their wings into a light breeze and physics would do the rest. Their success was abetted by a delicate construction, their skeletons modified into rigid, boxy airframes made of bones hollowed almost to paper-thinness.
  • But dinosaurs also excelled at being small. Some were so small they could have danced in the palm of your hand. Microraptor, for example, was the size of a crow and weighed no more than a kilogram; the peculiar, bat-like Yi, diminutive in name as well as size, weighed less than half that.
  • A sizable beneficiary was the liver, which generated a lot of heat and, in a large dinosaur, was the size of a car. The air-cooled internal workings of dinosaurs were more efficient than the liquid-cooled mammalian version. This allowed dinosaurs to become much larger than mammals ever could, without boiling themselves alive.
  • Although vertebrates in general have always laid eggs—a habit that allowed the final conquest of the land by the first amniotes—many vertebrates have reverted to the ancestral habit, found in the earliest jawed vertebrates, of bearing live young. It is all a matter of finding a strategy that protects the offspring without incurring too onerous a cost on the parent. Mammals started by laying eggs. Almost all of them became live-bearers, but at terrible cost. Live-bearing demands vast expenditures of energy, and this sets limits on the sizes that mammals can achieve on land. It also limits the number of offspring they can produce at once.
  • Madagascar, then as now, was a haven for the exotic. In the Cretaceous, many ecological niches there, even vegetarianism, were occupied by crocodiles.
  • The dinosaurs’ card had been marked long before. Around 160 million years ago, in the late Jurassic, a collision in the distant asteroid belt produced the forty-kilometer-diameter asteroid now known as Baptistina, along with a magazine of more than a thousand fragments, each more than a kilometer across, some much larger. These harbingers of doom dispersed into the inner solar system.
  • We humans, at least in childhood, can hear notes as high as 20 kHz, much higher than the highest song of the skylark. But humans are cloth eared compared with many other mammals, such as dogs (45 kHz), ring-tailed lemurs (58 kHz), mice (70 kHz), and cats (85 kHz), and they are profoundly deaf compared with dolphins (160 kHz). The evolution of the chain of three bones in the mammalian middle ear opened up to mammals an entirely new sensory universe inaccessible to other vertebrates.
  • When the backbone evolved half a billion years ago, it was a structure held horizontally, in tension. In hominins, it moved through ninety degrees, to be held vertically, in compression. No more radical alteration in the engineering requirements of the backbone has happened since it first evolved, and it can only be regarded as maladaptive; witness that back problems constitute one of the most costly and frequent causes of illness in humans today. Dinosaurs made a huge success of being bipeds but did so in a different way; they held their backbones horizontally, using their long, stiff tails as counterbalances. But hominins, like apes, have no tails and achieved bipedality the hard way.
  • they took a step that would be as revolutionary as standing upright had been to their now-distant forest ancestors: they learned how to run.
  • Half a million years ago, Britain was buried under ice a mile thick. In contrast, the climate was so warm 125,000 years ago that lions hunted deer on the banks of the Thames, and hippos wallowed as far north as the River Tees. Forty-five thousand years ago, Britain was a treeless steppe where reindeer roamed in winter and bison in summer. Twenty-six thousand years ago, it was too cold even for reindeer.
  • The hand axe is so distinctive because it has more or less the same design wherever it is found, irrespective of its age or the material from which it is made. Its association with a particular species—Homo erectus—suggests that hand axes, for all their undeniable beauty, were made according to a hardwired, stereotypical design. They were created as thoughtlessly as birds make their nests. If, when creating a hand axe, the maker made a mistake in the sequence of strokes required to chip it from a blank flint, they would not try to fix it or perhaps turn it to some other purpose. They would simply discard the mistake and start again from the beginning with a fresh blank.
  • Within the next few thousand years, Homo sapiens will have vanished. The cause will be, in part, the repayment of an extinction debt, long overdue. The patch of habitat occupied by humanity is nothing less than the entire Earth, and human beings have been making it progressively less habitable. The main reason, though, will be a failure of population replacement. The human population is likely to peak during the present century, after which it will decline. By 2100, it will be less than it is today.
Posted in Lit Review | Tagged , , , , | Leave a comment

A Prayer for the Crown-Shy

Title: A Prayer for the Crown-Shy: A Monk and Robot Book

Author: Becky Chambers

Completed: July 2023 (Full list of books)

Overview: I read the first in this series, A Psalm for the Wild-Built, last year and really enjoyed it. This is the second one, and it came out shortly after I finished the first. Again, the author builds such an amazing world that I'm ready to move there. She brings up interesting philosophical questions that are fun to think about, but mostly I just want to sit down in these places and hang out with the characters.

Highlights:

  • “Nobody should be barred from necessities or comforts just because they don’t have the right number next to their name.”
  • A river-build, as it happened, was whatever its creator wanted to make out of whatever they had on hand.
  • “On the contrary, our way of life shows you how comfortable the world is on its own. Paring things down makes the small comforts all the sweeter. You don’t know how to be grateful for a well-sealed wall if you haven’t had a winter storm bust through a weak one. You don’t know how sweet strawberries are unless you’ve waited six months for them to fruit. Elsewhere, they have all these little luxuries, but they don’t understand that food and shelter and company are all you really need.
  • You don’t have to have a reason to be tired. You don’t have to earn rest or comfort. You’re allowed to just be.
  • “All parasites have value, Sibling Dex. Not to their hosts, perhaps, but you could say the same about a predator and a prey animal. They all give back—not to the individual but to the ecosystem at large. Wasps are tremendously important pollinators. Birds and fish eat bloodsucks.”
Posted in Lit Review | Tagged , , | Leave a comment

The Rules We Break

Title: The Rules We Break: Lessons in Play, Thinking, and Design

Author: Eric Zimmerman

Completed: July 2023 (Full list of books)

Overview: I expected this book to be more about design in general. I knew games would be the focus, but there was little else. That said, no book since Reality is Broken has gotten me more excited about playing games. There are a bunch of game ideas that you can play or use as a jumping-off point for game design. There are some interesting ideas related to education, including a general dislike of “gamification” as it is often used to put a thin veneer of “fun” onto otherwise didactic lessons. This reminds me of exhibit design at science centers. In both cases, starting with a focus on “what they will learn” almost guarantees no one will enjoy the exhibit or game. Rather, we should focus on what they will do. Given enough experiences of a phenomenon, people will develop an understanding of the basic principles, which can evolve as they gain other, deeper experiences.

Highlights:

  • the twenty-first century might be termed a Ludic Century, an epoch in which games and play are the model for how we interact with culture and with each other.
  • More than just goofing around, play is a profound way of understanding how we relate to each other, how meaning is made, and how to critically engage with the world.
  • When you’re designing, the tendency is to spend a lot of time discussing ideas and concepts. Instead, design by doing. Get to the point where you are making something interactive as soon as possible. Don’t talk about a story: tell a story. Don’t theorize about the experience: actually build it. Put together a prototype.
  • The third stage begins around age ten. Children come to see Marbles as a social contract, a set of rules that gain their authority only because the players agree to follow them. This means that if everyone agrees, the rules can be changed. This is essentially how adults see games too: as a voluntary, social construct. Play in this sense is wonderfully flexible but also quite fragile. Play happens only if and when we all agree to it.
  • For Piaget, the game of Marbles was a lens for exploring how our morals develop. For DeKoven, the play community is a chance to practice being better people together. These are not just abstract ideas. Every moment of play is an opportunity to exercise collaboration with other human beings and to explore the curious social contract of play.
  • Think about: Enjoying the rules. I used to hurry through explaining the rules of a game like Five Fingers, with the idea of getting everyone playing as quickly as possible. I have learned from grandmaster of play Bernie DeKoven to slow down and enjoy the performance of explaining rules. Take your time. Be clear and repeat key ideas. Make jokes. Build suspense. Have fun.
  • Any simulation is a statement about the mechanics of reality. Crucially, what you exclude is as important as what you decide to include.
  • Part of what defines a game is the goal. An outcome. Winning or losing, or receiving a score. A final goal certainly doesn’t have to be part of every kind of play, yet in a very traditional sense, a goal is what makes a game a game. Friends can casually ski down a slope together, or they can race against the clock to see who can get a faster time. Game designer Frank Lantz calls the goal of a game a kind of gravity. It orders events, letting you know which way is up and which way is down. Without a goal, how do you know that a move was good or bad? How can you decide what you should do next? In many kinds of play, we invent our own goals.
  • Field-defining scholars who study games and learning, like James Gee, Constance Steinkuehler, and Kurt Squire, view any well-designed game as intrinsically valuable. Games can engender communities of learning, they can help us think rigorously, they provide contexts where we can learn how to learn. Yet the gamification of education too often treats games like injectors of curricular information: games exist to deliver data more efficiently. This approach replicates today’s unfortunate trend of test-driven factual knowledge—the absolutely lowest form of learning—and has nothing to do with how education can address the complex challenges facing the world today.
  • “Where do you get your ideas?” This is a question an artist or designer or other creative person is often asked when they talk about what they do. Too often. The question implies that the most important moment in the life cycle of a project is the moment of conception, the mythical instant of genius when the answer appears in the creator’s mind, and that the rest of the work after that is just fleshing out the details. There’s just one problem: that story is dead opposite of reality. Why? Because it ignores what is actually far more important: the process. An initial idea is just a starting point. The hard work, the real work, the place of discovery and creativity, is each step along the way. I’m not the first to say it: ideas are cheap. Everyone has good ideas.
  • How do people get a genuine sense of real authorship over what they do? The answer is brilliantly simple: don’t fake it. They need to have actual, real authorship. The only way to make someone feel like they are making important decisions is to let them make important decisions. This is the opposite of a more traditional approach, in which a lead designer or creative director carries the vision for a project and has final approval over everything. Giving everyone on a team autonomy sounds scary, and it is! Yet it’s the only way to really get everyone’s heads in the game. So how does this not devolve into anarchy? The first step is to clearly define everyone’s roles and responsibilities—and to give them actual autonomy within that clearly defined role.
  • The practice of making something by trying out an early version of it is at the heart of iterative design. Playtesting goes by many names: editing, rehearsal, modeling. It is a methodology that can be applied to just about any field and any kind of project. Question: When do you start playtesting? Answer: Before you think you are ready. If you feel totally comfortable sharing your work in progress, it’s probably too late! Playtest as early as you can so that there is still time to make changes based on any findings you discover.
  • To break rules requires knowing what the rules are and giving yourself permission to leave them behind. This approach to creativity comes down to two things: having literacy about what already exists and feeling the freedom to go beyond it.
  • For a great example of hardcore creativity, consider Chindōgu, the “art of useless inventions.” It has produced such absurd objects as foot umbrellas to keep shoes dry and a chopsticks-mounted fan for cooling down hot soup noodles. In fact the now-ubiquitous selfie stick first appeared as a Chindōgu invention in the 1990s—twenty years before it caught on! Kenji Kawakami invented Chindōgu and its philosophy, embodied in ten principles, including: Chindōgu are tools for everyday life; Chindōgu are not for sale; Inherent in every Chindōgu is the spirit of anarchy. Chindōgu means embracing these sometimes contradictory, rule-breaking rules.
  • Criticism is like spicy food. It is painful at first, but then you develop a taste for it. Eventually, you can’t get enough. The goal is to inculcate that craving to give and receive feedback—hard-hitting, helpful, sensitive, insightful feedback—in yourself and your collaborators.
  • I sometimes get into arguments with friends who are worried about teaching. They feel like they haven’t yet accumulated sufficient knowledge or expertise in order to instruct others. That’s hogwash! In fact, they have the entire idea of teaching exactly backward. They shouldn’t teach because they have arrived at some kind of endpoint. They should teach to learn about something, to get better at it, and to investigate it more deeply.
Posted in Lit Review | Tagged , , , | 1 Comment

A Complaint Free World

Title: A Complaint Free World: How to Stop Complaining and Start Enjoying the Life You Always Wanted

Author: Will Bowen

Completed: July 2023 (Full list of books)

Overview: I like the idea presented here, but this book was too woo-woo for me. The very short version is that we all complain too much, and using a bracelet (or some other moveable token) to track how often we complain can help us reduce how often we do it. There’s a lot of evidence that complaining less and being more grateful is healthy for us, and noticing when we do something is the first step to managing it.

I went into this book knowing that Will uses purple bracelets to monitor his complaining to help him reduce it. I had hoped the book would cover strategies others have used to be successful, but the only strategy I got from the book was to wear a bracelet and move it every time you complain. I’m going to try wearing and moving a bracelet, but wouldn’t recommend reading the book.

Highlights:

  • “What you Articulate, you Demonstrate!”
  • “The universe is change; our life is what our thoughts make it.” — MARCUS AURELIUS
  • There are two things upon which most people will agree: There is too much complaining in the world. The state of the world is not the way we would like it. In my opinion, there is a correlation between the two. We are focusing on what is wrong rather than focusing our vision on a healthy, happy, and harmonious world.
  • Of all the self-fulfilling prophecies in our culture, the assumption that aging means decline and poor health is probably the deadliest.
  • This explains why I lump gossip in with complaining. Am I opposed to gossip? Absolutely not. As long as: What you’re saying about the absent person is complimentary. You would repeat, word for word, what you are saying if the absent person were present. If you can follow those two simple rules, gossip all you want.
  • You might wonder, “When is what I’m saying a complaint and when is it just a statement of fact?” According to Dr. Robin Kowalski, “Whether or not the particular statement reflects a complaint … depends on whether the speaker is experiencing an internal dissatisfaction.”* The words in a complaint and a noncomplaint can be identical; what distinguishes the two is your meaning, your energy behind them.
  • It’s a complaint if you want the person or situation changed. If you want it other than how it is, it’s a complaint and not just a statement of fact
  • In The Lazy Man’s Guide to Enlightenment, Thaddeus Golas summed it up: “Loving yourself is not a matter of building up your ego. Egotism is proving you are worthwhile after you have sunk into hating yourself. Loving yourself will dissolve your ego: you will feel no need to prove you are superior.”
  • When these true “peace talks” occur, the rules will be simple. Rather than talking about what is going on in the present or what has happened in the past, the focus will remain exclusively on what it will be like when there is no more acrimony between them. They might ask, “What will peace between us look, feel, sound, and smell like? What will it be like when war and disagreement between us is such a distant memory we would have to consult history books because such a time is lost to us?”
  • I realized that before we adopted a Complaint Free lifestyle, I was teaching Lia that being at the family dinner table was a time to gripe and gossip. I was modeling for her that this is how people act. I’m so grateful now that our supper table is where we talk about blessings and bright vistas. This is what I want to pass on to her so she’ll model this for her children and their children after them. Let family time be joyous and happy, not a time to vent about how things didn’t go your way that day.
  • “How can I help bring about positive change if I don’t complain?” Again, change begins with dissatisfaction. It begins when someone like you sees a gap between what is and what can be. Dissatisfaction is the beginning, but it can’t be the end. If you complain about a situation, you may be able to draw others to you who will bellyache along with you, but you won’t be able to get much done. However, if you can begin to speak in terms of what it will be like when the challenge no longer exists, when the bridge is gapped, when the problem is solved, then you can excite and move people to positive change.
  • In his play Fiction, one of Steven Dietz’s characters remarks, “Writers don’t like to write; they like to have written.” Similarly, people don’t like to change, but they like to have changed.
  • just to be clear, I am not advocating remaining silent when there is something that has happened which you need corrected. Don’t hold back, don’t hold it in, just make sure you are only stating the facts and not putting any “how dare you do this to me!” energy behind what you are saying.
  • There is no ego in telling the waiter your soup is cold and needs to be heated up—if you stick to the facts, which are always neutral. “How dare you serve me cold soup … ?” That’s complaining.
  • Directing a comment to someone who can improve your situation is not complaining. Berating someone or lamenting the situation either to yourself or to another is complaining.
Posted in Lit Review | Tagged , , | Leave a comment

My “Beer on the Run” Interview

I was interviewed last week for my friend’s podcast, Beer on the Run. Usually Jack and Clint talk with ultra runners about their races and what beer they like. According to UltraSignup, I’ve only done two ultramarathons, which is far fewer than most of the people they interview, but that doesn’t count my 50k DNF or any of the Birthday Bashes. Either way, we talked a bit about running and spent more time talking about the Seventy48 that I did solo last year and this year with my dad (review of this year coming soon).

We also discuss some life lessons from The Comfort Crisis, what it’s like running with a kid in a jogger, my favorite and most unique homebrews, finding weird challenges that aren’t races, and other random topics that came up. It drew together many different aspects of my life into one (mostly) coherent hour-long chat. I hope you enjoy the conversation.

If you like this format, you can find all 118+ episodes of Beer on the Run on Stitcher, Spotify, and all the other podcast places.

Posted in Adventures, Running | Tagged , , , , | Leave a comment

If I Understood You, Would I Have This Look on My Face?

Title: If I Understood You, Would I Have This Look on My Face?: My Adventures in the Art and Science of Relating and Communicating

Author: Alan Alda

Completed: July 2023 (Full list of books)

Overview: This book boils down to a few points and a bunch of stories (many involving famous people) about problems with communication and potential solutions. Although a lot of it focuses on science communication and his work at the Alan Alda Center for Communicating Science at Stony Brook, most of the points would improve communication in general. According to Alda, empathy is king when it comes to communication. The presenter must connect with the audience. He sees acting, and specifically improv, as a way to build empathy with the audience, and he quotes research to back that up. He also points out that anything that helps people connect, from walking or marching in sync with others to finding shared interests, builds empathy and improves communication. Finally, he talks about the power of stories. Throughout this book, he showed his belief in the power of storytelling with anecdotes for every point he makes. Stories can certainly be a useful tool to draw the audience in… even if I didn’t use them here.

Highlights:

  • If they could understand these things, why couldn’t I? An accountant would tell me about the tax code in a way that made no sense. A salesman would explain an insurance policy that didn’t seem to have a basis in reality. It wasn’t any consolation when I came to realize that pretty much everybody misunderstands everybody else.
  • After a while, I saw that I was having trouble talking with them whenever I thought I knew more than I really did about their work. I was boxing in the scientists with questions that were based on false assumptions. I took a bold step and stopped reading the scientists’ research papers before I met with them. I would come in armed only with curiosity and my own natural ignorance. I was learning the value of bringing my ignorance to the surface. The scientists could see exactly how much I already understood, and they could start there. Ignorance was my ally as long as it was backed up by curiosity. Ignorance without curiosity is not so good,
  • I saw how the pull of formality and jargon can yank someone into not relating.
  • This is a natural stage in a child’s development. In fact, it’s not until about the age of four or five that it even occurs to children that deception is possible. There’s no point in lying if everybody knows what you’re thinking! But once Theory of Mind develops in a child, it also becomes clear that others might be lying to you, and it’s kind of important to know what’s going on inside these other people’s heads.
  • The person who’s communicating something is responsible for how well the other person follows him. If I’m trying to explain something and you don’t follow me, it’s not simply your job to catch up. It’s my job to slow down. This is at the heart of communicating: If I tell you something without making sure you got it, did I really communicate anything?
  • The simple act of walking in step produced greater cooperation? And more trust? It seemed hard to believe. But the tests of the groups’ cohesion were standardized. Their reliability had been verified many times over. In other studies, simply tapping in sync, like tapping on a table, produced the same results. After they had spent some time tapping in sync, the subjects paid more attention to the good of the group, and they made fewer selfish choices.
  • Someone might be selling a product to the class, but the class has to work hard to guess what the product is, because the sales pitch is entirely in gibberish—nonsense sounds that sound like a language but have no meaning.
  • Woolley’s group gathered 697 volunteers and divided them into small teams of two to five members each. They tested them at a number of tasks and found that the average intelligence of a group could not significantly predict the group’s performance. What could predict it, though, were three factors: the ability of the members of the group to freely take part in discussions, members’ scores on a standardized test of empathy, and, surprisingly, the presence of women in the group.
  • studied the fifteen hundred S&P firms over a fifteen-year period and found that when they had women in the top managerial positions, the firms were more successful. Interestingly, firms that had a strategy of innovation enjoyed the most success (but those that had a less innovative strategy did no worse having women at the helm). The authors suggest that the presence of women in top management positions helps specifically in situations where the focus is innovation. This, they say, is because women’s social skills lead in part to better decision making overall and also because studies have shown that “gender diversity in particular facilitates creativity.”
  • He struck a chord when he wrote about the pitfalls of assuming students are totally responsible for their own motivation, noting that “this can lead researchers to blame group members for their lack of motivation.” Instead, he feels that it’s up to the leader of the group to motivate the students, or else things can break down.
  • All of this suggests to me that an inescapable product of improvisation is empathy.
  • On the one hand, you can command good performance from someone in exchange for not firing them. On the other hand, you might be able to ignite the desire in a person to perform well by tuning in to their state of mind. And, in fact, this has been shown by research to be the better way.
  • Instead of saying, “You’ve done a bad thing; don’t do it again,” he’s saying, “You’ve done really good things; do more.” The first gives them a vision of failure they somehow have to avoid, while the second gives them a model of success to live up to. The CEO I was having lunch with might not have realized it, but he was following the improv principle of Yes And. He was accepting what the other player was giving him and adding to it.
  • not only is one of our core assumptions that a deep awareness of the other person is at the heart of good communication, but we also believe that empathy can be increased.
  • researchers wondered if experience in acting would lead to growth in empathy and Theory of Mind. To test this hypothesis, they did two studies, one on elementary school students and the other on high school freshmen. All the students were given standard empathy and Theory of Mind tests before and after the training, to see if the training had any effect. It did. Both groups of students trained in acting showed significant gains in their empathy scores. Adolescents showed even more progress than younger kids: They had significant gains not only in empathy, but in a test of Theory of Mind, as well. Control groups that had been given other kinds of arts training, such as training in music or visual arts, showed no such improvement. Only theater training did it.
  • The stereotypical view of empathy is that it makes you soft, that you have to abandon it if you need to be tough. On the contrary, when you have to be tough—or even if you choose to be cruel—empathy can be a useful tool. It doesn’t necessarily make you a nice guy.
  • If the first principle of teaching is to start with what they know, I think the Flame Challenge suggests that next in importance is that a little autonomy can give students the joy of discovery.
  • “So, how can we excite emotions in people who have no training in what we’re talking about?” I asked her. “Story,” she said.
  • the trouble with a lecture is that it answers questions that haven’t been asked.
  • We can identify with someone who has a goal, but we root for someone with both a goal and an obstacle.
Posted in Education, Lit Review | Tagged , , , | Leave a comment