Sr. Content Developer at Microsoft, working remotely in PA, TechBash conference organizer, former Microsoft MVP, Husband, Dad and Geek.

OpenAI rebrands itself

OpenAI’s wordmark uses the company’s updated typeface, OpenAI sans.

OpenAI just gave itself a full rebrand, complete with a new typeface, logo, and color palette, as explained to Wallpaper in an interview about the process behind the changes. You’ll have to look closely to spot the difference between the redrawn logo and its old one, but a side-by-side comparison shows the updated “blossom” with a slightly larger space in the center and cleaner lines.

Though the original logo was designed by OpenAI CEO Sam Altman and co-founder Ilya Sutskever, an in-house design team led by Veit Moeller and Shannon Jager took the reins this time around, intending to create a “more organic and more human” identity, Wallpaper reports.

OpenAI’s old logo (left) vs. its new one (right). Image: OpenAI

As part of the rebrand, OpenAI showed off a new typeface — called OpenAI sans — that it says “blends geometric precision and functionality with a rounded, approachable character.” It now uses this typeface in the OpenAI wordmark, which features an “O” with a perfectly round exterior and an imperfect interior “to counter any robotic precision and make things feel more human,” Moeller said, according to Wallpaper.

When asked whether OpenAI used the company’s AI-powered tools like ChatGPT to create the designs, Moeller told Wallpaper that the team only used it to help calculate different type weights.

“We collaborate with leading experts in photography, typography, motion, and spatial design while integrating AI tools like DALL·E, ChatGPT, and Sora as thought partners,” OpenAI’s designers told Wallpaper. “This dual approach — where human intuition meets AI’s generative potential — allows us to craft a brand that is not just innovative, but profoundly human.”

Uno Platform Wants Microsoft to Improve .NET WebAssembly in Two Ways

Uno Platform, a third-party dev tooling specialist that caters to .NET developers, published a report on the state of WebAssembly, highlighting two shortcomings in the .NET implementation that it would like to see Microsoft address.

The End of Programming as We Know It


There’s a lot of chatter in the media that software developers will soon lose their jobs to AI. I don’t buy it.

It is not the end of programming. It is the end of programming as we know it today. That is not new. The first programmers connected physical circuits to perform each calculation. They were succeeded by programmers writing machine instructions as binary code to be input one bit at a time by flipping switches on the front of a computer. Assembly language programming then put an end to that. It let programmers use a human-like language to tell the computer to move data to locations in memory and perform calculations on it. Then, the development of even higher-level compiled languages like Fortran, COBOL, and their successors C, C++, and Java meant that most programmers no longer wrote assembly code. Instead, they could express their wishes to the computer using higher-level abstractions.


Betty Jean Jennings and Frances Bilas (right) program the ENIAC in 1946. Via the Computer History Museum

Eventually, interpreted languages, which are much easier to debug, became the norm. 

BASIC, one of the first of these to hit the big time, was at first seen as a toy, but soon proved to be the wave of the future. Programming became accessible to kids and garage entrepreneurs, not just the back office priesthood at large companies and government agencies.

Consumer operating systems were also a big part of the story. In the early days of the personal computer, every computer manufacturer needed software engineers who could write low-level drivers that performed the work of reading and writing to memory boards, hard disks, and peripherals such as modems and printers. Windows put an end to that. It didn’t just succeed because it provided a graphical user interface that made it far easier for untrained individuals to use computers. It also provided what Marc Andreessen, whose company Netscape was about to be steamrollered by Microsoft, dismissively (and wrongly) called “just a bag of drivers.” That bag of drivers, fronted by the Win32 APIs, meant that programmers no longer needed to write low-level code to control the machine. That job was effectively encapsulated in the operating system. Windows and macOS, and for mobile, iOS and Android, mean that today, most programmers no longer need to know much of what earlier generations of programmers knew.

There were more programmers, not fewer

This was far from the end of programming, though. There were more programmers than ever. Users in the hundreds of millions consumed the fruits of their creativity. In a classic demonstration of elasticity of demand, as software was easier to create, its price fell, allowing developers to create solutions that more people were willing to pay for.

The web was another “end of programming.” Suddenly, the user interface was made up of human-readable documents, shown in a browser with links that could in turn call programs on remote servers. Anyone could build a simple “application” with minimal programming skill. “No code” became a buzzword. Soon enough, everyone needed a website. Tools like WordPress made it possible for nonprogrammers to create those websites without coding. Yet as the technology grew in capability, successful websites became more and more complex. There was an increasing separation between “frontend” and “backend” programming. New interpreted programming languages like Python and JavaScript became dominant. Mobile devices added a new, ubiquitous front end, requiring new skills. And once again, the complexity was hidden behind frameworks, function libraries, and APIs that insulated programmers from the low-level functionality that had been essential knowledge only a few years before.

Big data, web services, and cloud computing established a kind of “internet operating system.” Services like Apple Pay, Google Pay, and Stripe made it possible to do formerly difficult, high-stakes enterprise tasks like taking payments with minimal programming expertise. All kinds of deep and powerful functionality was made available via simple APIs. Yet this explosion of internet sites and the network protocols and APIs connecting them ended up creating the need for more programmers.
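To make that concrete, here is a minimal sketch of the kind of payment API call the paragraph describes, using Stripe’s Python library. The key, amount, and currency below are placeholders for illustration, and exact parameter names vary by SDK version.

# Minimal sketch: accepting a payment through a payments API (Stripe's Python SDK).
# The API key and values are placeholders, not real credentials.
import stripe

stripe.api_key = "sk_test_placeholder"  # test-mode secret key (placeholder)

# Create a PaymentIntent for $20.00. The service handles the card networks, fraud
# screening, and compliance work that once required substantial in-house code.
intent = stripe.PaymentIntent.create(
    amount=2000,  # amount in the smallest currency unit (cents)
    currency="usd",
    automatic_payment_methods={"enabled": True},
)

print(intent.id, intent.status)

A handful of lines standing in for what was once a months-long integration project is exactly the kind of encapsulation that created demand for more programmers, not fewer.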

Programmers were no longer building static software artifacts updated every couple of years but continuously developing, integrating, and maintaining long-lived services. Even more importantly, much of the work at these vast services, like Google Search, Google Maps, Gmail, Amazon, Facebook, and Twitter, was automated at vast scale. Programs were designed and built by humans, not AI, but much of the work itself was done by special-purpose predecessors to today’s general purpose AIs. The workers that do the bulk of the heavy lifting at these companies are already programs. The human programmers are their managers. There are now hundreds of thousands of programmers doing this kind of supervisory work. They are already living in a world where the job is creating and managing digital co-workers.


“Google, Facebook, Amazon, or a host of more recent Silicon Valley startups…employ tens of thousands of workers. If you think with a twentieth century factory mindset, those workers spend their days grinding out products, just like their industrial forebears, only today, they are producing software rather than physical goods. If, instead, you step back and view these companies with a 21st century mindset, you realize that a large part of the work of these companies – delivering search results, news and information, social network status updates, and relevant products for purchase – is done by software programs and algorithms. These are the real workers, and the programmers who create them are their managers.”—Tim O’Reilly, “Managing the Bots That Are Managing the Business,” MIT Sloan Management Review, May 21, 2016

In each of these waves, old skills became obsolescent—still useful but no longer essential—and new ones became the key to success. There are still a few programmers who write compilers, thousands who write popular JavaScript frameworks and Python libraries, but tens of millions who write web and mobile applications and the backend software that enables them. Billions of users consume what they produce.

Might this time be different?

Suddenly, though, it is seemingly possible for a nonprogrammer to simply talk to an LLM or specialized software agent in plain English (or the human language of your choice) and get back a useful prototype in Python (or the programming language of your choice). There’s even a new buzzword for this: CHOP, or “chat-oriented programming.” The rise of advanced reasoning models is beginning to demonstrate AI that can generate even complex programs with a high-level prompt explaining the task to be accomplished. As a result, there are a lot of people saying “this time is different,” that AI will completely replace most human programmers, and in fact, most knowledge workers. They say we face a wave of pervasive human unemployment.
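As a purely illustrative sketch of what chat-oriented programming looks like in practice, the snippet below asks a model for a small Python prototype through OpenAI’s Python client. The model name and prompt are placeholder assumptions, and the returned draft still has to be reviewed, tested, and integrated by a person.

# Illustrative sketch of "chat-oriented programming": describe the task in plain
# English and get back candidate Python code to review. The model name and the
# prompt are placeholders, not recommendations.
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder; any capable code-generation model
    messages=[
        {"role": "system", "content": "You are a careful Python developer."},
        {
            "role": "user",
            "content": "Write a small script that deduplicates rows in a CSV "
                       "file by email address and writes the result to a new file.",
        },
    ],
)

draft = response.choices[0].message.content
print(draft)  # a human still decides whether this draft is correct and maintainable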

I still don’t buy it. When there’s a breakthrough that puts advanced computing power into the hands of a far larger group of people, yes, ordinary people can do things that were once the domain of highly trained specialists. But that same breakthrough also enables new kinds of services and demand for those services. It creates new sources of deep magic that only a few understand.

The magic that’s coming now is the most powerful yet. And that means that we’re beginning a profound period of exploration and creativity, trying to understand how to make that magic work and to derive new advantages from its power. Smart developers who adopt the technology will be in demand because they can do so much more, focusing on the higher-level creativity that adds value.

Learning by doing

AI will not replace programmers, but it will transform their jobs. Eventually much of what programmers do today may be as obsolete (for everyone but embedded system programmers) as the old skill of debugging with an oscilloscope. Master programmer and prescient tech observer Steve Yegge observes that it is not junior and mid-level programmers who will be replaced but those who cling to the past rather than embracing the new programming tools and paradigms. Those who acquire or invent the new skills will be in high demand. Junior developers who master the tools of AI will be able to outperform senior programmers who don’t. Yegge calls it “The Death of the Stubborn Developer.”

My ideas are shaped not only by my own past 40+ years of experience in the computer industry and the observations of developers like Yegge but also by the work of economic historian James Bessen, who studied how the first Industrial Revolution played out in the textile mills of Lowell, Massachusetts during the early 1800s. As skilled crafters were replaced by machines operated by “unskilled” labor, human wages were indeed depressed. But Bessen noticed something peculiar by comparing the wage records of workers in the new industrial mills with those of the former home-based crafters. It took just about as long for an apprentice craftsman to reach the full wages of a skilled journeyman as it did for one of the new entry-level unskilled factory workers to reach full pay and productivity. The workers in both regimes were actually skilled workers. But they had different kinds of skills.

There were two big reasons, Bessen found, why wages remained flat or depressed for most of the first 50 years of the Industrial Revolution before taking off and leading to a widespread increase of prosperity. The first was that the factory owners hoarded the benefits of the new productivity rather than sharing it with workers. But the second was that the largest productivity gains took decades to arrive because the knowledge of how best to use the new technology wasn’t yet widely dispersed. It took decades for inventors to make the machines more robust, for those using them to come up with new kinds of workflows to make them more effective, to create new kinds of products that could be made with them, for a wider range of businesses to adopt the new technologies, and for workers to acquire the necessary skills to take advantage of them. Workers needed new skills not only to use the machines but to repair them, to improve them, to invent the future that they implied but had not yet made fully possible. All of this happens through a process that Bessen calls “learning by doing.”

It’s not enough for a few individuals to be ahead of the curve in adopting the new skills. Bessen explains that “what matters to a mill, an industry, and to society generally is not how long it takes to train an individual worker but what it takes to create a stable, trained workforce” (Learning by Doing, 36). Today, every company that is going to be touched by this revolution (which is to say, every company) needs to put its shoulder to the wheel. We need an AI-literate workforce. What is programming, after all, but the way that humans get computers to do our bidding? The fact that “programming” is getting closer and closer to human language, that our machines can understand us rather than us having to speak to them in their native tongue of 0s and 1s, or some specialized programming language pidgin, should be cause for celebration.

People will be creating, using, and refining more programs, and new industries will be born to manage and build on what we create. Lessons from history tell us that when automation makes it cheaper and easier to deliver products that people want or need, increases in demand often lead to increases in employment. It is only when demand is satisfied that employment begins to fall. We are far from that point when it comes to programming.

Not surprisingly, Wharton School professor and AI evangelist Ethan Mollick is also a fan of Bessen’s work. This is why he argues so compellingly to “always bring AI to the table,” to involve it in every aspect of your job, and to explore “the jagged edge” of what works and what doesn’t. It is also why he urges companies to use AI to empower their workers, not to replace them. There is so much to learn about how to apply the new technology. A business’s best source of applied R&D is the explorations of the people it already has, as they use AI to solve their problems and seek out new opportunities.

What programming is will change

Sam Schillace, one of the deputy CTOs at Microsoft, agreed with my analysis. In a recent conversation, he told me, “We’re in the middle of inventing a new programming paradigm around AI systems. When we went from the desktop into the internet era, everything in the stack changed, even though all the levels of the stack were the same. We still have languages, but they went from compiled to interpreted. We still have teams, but they went from waterfall to Agile to CI/CD. We still have databases, but they went from ACID to NoSQL. We went from one user, one app, one thread, to multi distributed, whatever. We’re doing the same thing with AI right now.”

Here are some of the technologies that are being assembled into a new AI stack. And this doesn’t even include the plethora of AI models, their APIs, and their cloud infrastructure. And it’s already out of date!


“AI Engineering Landscape,” via Marie-Alice Blete on GitHub

But the explosion of new tools, frameworks, and practices is just the beginning of how programming is changing. One issue, Schillace noted, is that models don’t have memory the way humans have memory. Even with large context windows, they struggle to do what he calls “metacognition.” As a result, he sees the need for humans to still provide a great deal of the context in which their AI co-developers operate.

Schillace expanded on this idea in a recent post. “Large language models (LLMs) and other AI systems are attempting to automate thought,” he wrote. “The parallels to the automation of motion during the industrial revolution are striking. Today, the automation is still crude: we’re doing the cognitive equivalent of pumping water and hammering—basic tasks like summarization, pattern recognition, and text generation. We haven’t yet figured out how to build robust engines for this new source of energy—we’re not even at the locomotive stage of AI yet.”

Even the locomotive stage was largely an expansion of the brute force humans were able to bring to bear when moving physical objects. The essential next breakthrough was an increase in the means of control over that power. Schillace asks, “What if traditional software engineering isn’t fully relevant here? What if building AI requires fundamentally different practices and control systems? We’re trying to create new kinds of thinking (our analog to motion): higher-level, metacognitive, adaptive systems that can do more than repeat pre-designed patterns. To use these effectively, we’ll need to invent entirely new ways of working, new disciplines. Just as the challenges of early steam power birthed metallurgy, the challenges of AI will force the emergence of new sciences of cognition, reliability, and scalability—fields that don’t yet fully exist.”

The challenge of deploying AI technologies in business

Bret Taylor, formerly co-CEO of Salesforce, one-time Chief Technology Officer at Meta, and long ago, leader of the team that created Google Maps, is now the CEO of AI agent developer Sierra, a company at the heart of developing and deploying AI technology in businesses. In a recent conversation, Bret told me that he believes that a company’s AI agent will become its primary digital interface, as significant as its website, as significant as its mobile app, perhaps even more so. A company’s AI agent will have to encode all of its key business policies and processes. This is something that AI may eventually be able to do on its own, but today, Sierra has to assign each of its customers an engineering team to help with the implementation.

“That last mile of taking a cool platform and a bunch of your business processes and manifesting an agent is actually pretty hard to do,” Bret explained. “There’s a new role emerging now that we call an agent engineer, a software developer who looks a little bit like a frontend web developer. That’s an archetype that’s the most common in software. If you’re a React developer, you can learn to make AI agents. What a wonderful way to reskill and make your skills relevant.”

Who will want to wade through a customer service phone tree when they could be talking to an AI agent that can actually solve their problem? But getting those agents right is going to be a real challenge. It’s not the programming that’s so hard. It’s deeply understanding the business processes and thinking through how they can be transformed to take advantage of the new capabilities. An agent that simply reproduces existing business processes will be as embarrassing as a web page or mobile app that simply recreates a paper form. (And yes, those do still exist!)

Addy Osmani, the head of user experience for Google Chrome, calls this the 70% problem: “While engineers report being dramatically more productive with AI, the actual software we use daily doesn’t seem like it’s getting noticeably better.” He notes that nonprogrammers working with AI code generation tools can get out a great demo or solve a simple problem, but they get stuck on the last 30% of a complex program because they don’t know enough to debug the code and guide the AI to the correct solution. Meanwhile:

When you watch a senior engineer work with AI tools like Cursor or Copilot, it looks like magic. They can scaffold entire features in minutes, complete with tests and documentation. But watch carefully, and you’ll notice something crucial: They’re not just accepting what the AI suggests…. They’re applying years of hard-won engineering wisdom to shape and constrain the AI’s output. The AI is accelerating their implementation, but their expertise is what keeps the code maintainable. Junior engineers often miss these crucial steps. They accept the AI’s output more readily, leading to what I call “house of cards code” – it looks complete but collapses under real-world pressure.

In this regard, Chip Huyen, the author of the new book AI Engineering, made an illuminating observation in an email to me:

I don’t think AI introduces a new kind of thinking. It reveals what actually requires thinking.

No matter how manual, if a task can only be done by a handful of those most educated, that task is considered intellectual. One example is writing, the physical act of copying words onto paper. In the past, when only a small portion of the population was literate, writing was considered intellectual. People even took pride in their calligraphy. Nowadays, the word “writing” no longer refers to this physical act but the higher abstraction of arranging ideas into a readable format.

Similarly, once the physical act of coding can be automated, the meaning of “programming” will change to refer to the act of arranging ideas into executable programs.

Mehran Sahami, the chair of Stanford’s CS department, put it simply: “Computer science is about systematic thinking, not writing code.”

When AI agents start talking to agents…

…precision in articulating the problem correctly gets even more important. An agent as a corporate frontend that provides access to all of a company’s business processes will be talking not just to consumers but also to agents for those consumers and agents for other companies.

That entire side of the agent equation is far more speculative. We haven’t yet begun to build out the standards for cooperation between independent AI agents! A recent paper on the need for agent infrastructure notes:

Current tools are largely insufficient because they are not designed to shape how agents interact with existing institutions (e.g., legal and economic systems) or actors (e.g., digital service providers, humans, other AI agents). For example, alignment techniques by nature do not assure counterparties that some human will be held accountable when a user instructs an agent to perform an illegal action. To fill this gap, we propose the concept of agent infrastructure: technical systems and shared protocols external to agents that are designed to mediate and influence their interactions with and impacts on their environments. Agent infrastructure comprises both new tools and reconfigurations or extensions of existing tools. For example, to facilitate accountability, protocols that tie users to agents could build upon existing systems for user authentication, such as OpenID. Just as the Internet relies on infrastructure like HTTPS, we argue that agent infrastructure will be similarly indispensable to ecosystems of agents. We identify three functions for agent infrastructure: 1) attributing actions, properties, and other information to specific agents, their users, or other actors; 2) shaping agents’ interactions; and 3) detecting and remedying harmful actions from agents.
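The first of those functions, attributing actions to specific agents and their users, hints at the kind of plumbing that still has to be designed and built. The sketch below is not from the paper; it is a hypothetical illustration using only Python’s standard library, in which each agent action carries an HMAC signature tied to a user-issued key so a counterparty can later attribute and verify it.

# Hypothetical illustration (not from the paper): attributing an agent's action
# to the user who authorized it, via an HMAC signature over the action record.
import hashlib
import hmac
import json

def sign_action(user_id, agent_id, action, user_key):
    """Wrap an agent action with identifiers and a signature a counterparty can check."""
    record = {"user_id": user_id, "agent_id": agent_id, "action": action}
    payload = json.dumps(record, sort_keys=True).encode()
    record["signature"] = hmac.new(user_key, payload, hashlib.sha256).hexdigest()
    return record

def verify_action(record, user_key):
    """Recompute the signature to confirm the record is attributable and untampered."""
    unsigned = {k: v for k, v in record.items() if k != "signature"}
    payload = json.dumps(unsigned, sort_keys=True).encode()
    expected = hmac.new(user_key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(record.get("signature", ""), expected)

key = b"example-shared-key"  # in practice, issued and managed by identity infrastructure
signed = sign_action("user-42", "agent-7", {"type": "purchase", "amount": 30}, key)
print(verify_action(signed, key))  # True

Key issuance, revocation, delegation, and dispute handling would all have to be layered on top of primitives like this.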

There are huge coordination and design problems to be solved here. Even the best AI agents we can imagine will not solve complex coordination problems like this without human direction. There is enough programming needed here to keep even AI-assisted programmers busy for at least the next decade.

In short, there is a whole world of new software to be invented, and it won’t be invented by AI alone but by human programmers using AI as a superpower. And those programmers need to acquire a lot of new skills.

We are in the early days of inventing the future

There is so much new to learn and do. So yes, let’s be bold and assume that AI codevelopers make programmers ten times as productive. (Your mileage may vary, depending on how eager your developers are to learn new skills.) But let’s also stipulate that once that happens, the “programmable surface area” of a business, of the sciences, of our built infrastructure will rise in parallel. If there are 20x the number of opportunities for programming to make a difference, we’ll still need twice as many of those new 10x programmers!

User expectations are also going to rise. Businesses that simply use the greater productivity to cut costs will lose out to companies that invest in harnessing the new capabilities to build better services.

As Simon Willison, a longtime software developer who has been at the forefront of showing the world how programming can be easier and better in the AI era, notes, AI lets him “be more ambitious” with his projects.

Take a lesson from another field where capabilities exploded: It may take as long to render a single frame of one of today’s Marvel superhero movies as it did to render the entirety of the first Pixar film even though CPU/GPU price and performance have benefited from Moore’s Law. It turns out that the movie industry wasn’t content to deliver low-res crude animation faster and more cheaply. The extra cycles went into thousands of tiny improvements in realistic fur, water, clouds, reflections, and many many more pixels of resolution. The technological improvement resulted in higher quality, not just cheaper/faster delivery. There are some industries made possible by choosing cheaper/faster over higher production values (consider the explosion of user-created video online), so it won’t be either-or. But quality will have its place in the market. It always does.

Imagine tens of millions of amateur AI-assisted programmers working with AI tools like Replit and Devin or enterprise solutions like those provided by Salesforce, Palantir, or Sierra. What is the likelihood that they will stumble over use cases that will appeal to millions? Some of them will become the entrepreneurs of this next generation of software created in partnership with AI. But many of their ideas will be adopted, refined, and scaled by existing professional developers.

The Journey from Prototype to Production

In the enterprise, AI will make it much more possible for solutions to be built by those closest to any problem. But the best of those solutions will still need to travel the rest of the way on what Shyam Sankar, the CTO of Palantir, has called “the journey from prototype to production.” Sankar noted that the value of AI to the enterprise is “in automation, in enterprise autonomy.” But as he also pointed out, “Automation is limited by edge cases.” He recalled the lessons of Stanley, the self-driving car that won the DARPA Grand Challenge in 2005: able to do something remarkable but requiring another 20 years of development to fully handle the edge cases of driving in a city.

“Workflow still matters,” Sankar argued, and the job of the programmer will be to understand what can be done by traditional software, what can be done by AI, what still needs to be done by people, and how you string things together to actually accomplish the workflow. He notes that “a toolchain that enables you to capture feedback and learn the edge cases to get there as quickly as possible is the winning tool chain.” In the world Sankar envisions, AI is “actually going to liberate developers to move into the business much more and be much more levered in the impact they deliver.” Meanwhile, the top-tier subject matter experts will become programmers with the help of AI assistants. It is not programmers who will be out of work. It will be the people—in every job role—who don’t become AI-assisted programmers.

This is not the end of programming. It is the beginning of its latest reinvention.


On April 24, O’Reilly Media will be hosting Coding with AI: The End of Software Development as We Know It—a live virtual tech conference spotlighting how AI is already supercharging developers, boosting productivity, and providing real value to their organizations. If you’re in the trenches building tomorrow’s development practices today and interested in speaking at the event, we’d love to hear from you by March 5th. You can find more information and our call for presentations here.

SE Radio 654: Chris Patterson on MassTransit and Event-Driven Systems


Chris Patterson, founder and principal architect of MassTransit, joins host Jeff Doolittle to discuss MassTransit, a message bus framework for building distributed systems. The conversation begins with an exploration of message buses, their role in asynchronous and durable application design, and how frameworks like MassTransit simplify event-driven programming in .NET. Chris explains concepts like pub/sub, durable messaging, and the benefits of decoupled architectures for scaling and reliability. 
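As a toy illustration of the publish/subscribe decoupling the episode describes (generic Python, not MassTransit, which is a .NET framework), a publisher emits an event by type and any number of consumers react without the publisher knowing who they are:

# Toy in-process publish/subscribe sketch (illustration only, not MassTransit).
from collections import defaultdict

class InMemoryBus:
    """Maps an event type to the handler functions subscribed to it."""

    def __init__(self):
        self._subscribers = defaultdict(list)  # event type -> list of handlers

    def subscribe(self, event_type, handler):
        self._subscribers[event_type].append(handler)

    def publish(self, event):
        # Deliver the event to every handler registered for its exact type.
        for handler in self._subscribers[type(event)]:
            handler(event)

class OrderSubmitted:
    def __init__(self, order_id):
        self.order_id = order_id

bus = InMemoryBus()
bus.subscribe(OrderSubmitted, lambda e: print(f"billing saw order {e.order_id}"))
bus.subscribe(OrderSubmitted, lambda e: print(f"shipping saw order {e.order_id}"))
bus.publish(OrderSubmitted("A-1001"))  # both consumers react; the publisher knows neither

A framework like MassTransit supplies what this sketch leaves out: durable transport over a real message broker, retries, and the consumer lifecycle management discussed in the episode.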

The discussion also delves into advanced topics such as sagas, stateful consumers for orchestrating complex processes, and how MassTransit supports patterns like outbox and routing slips for ensuring transactional consistency. Chris highlights the importance of observability in distributed systems, sharing how MassTransit integrates with tools like OpenTelemetry to provide comprehensive monitoring.

The episode includes advice on adopting event-driven approaches, overcoming leadership hesitancy, and ensuring secure and efficient implementations. Chris emphasizes the balance between leveraging cutting-edge tools and addressing real-world challenges in software architecture.

Brought to you by IEEE Computer Society and IEEE Software magazine.





Download audio: https://traffic.libsyn.com/secure/seradio/654-chris-patterson-masstransit.mp3?dest-id=23379

Resisting Coercion and Conformity Demands a Conscious Design: Here’s How to Get Started


Kim Scott is the author of Radical Candor: Be a Kick-Ass Boss Without Losing Your Humanity and Radical Respect: How to Work Together Better and co-founder of Radical Candor, a company that helps people put the ideas in her books into practice. Subscribe to her Radical Respect Newsletter on LinkedIn to get more tips for dismantling coercion and conformity so you can build a just workplace where Radical Candor can flourish.

Brutal Ineffectiveness is Bad for Everyone


Brutal Ineffectiveness is what you get when, in addition to management systems that unconsciously reward conformity, the systems optimize for coercion rather than collaboration, producing more outright bullying and harassment.

The Weinstein Company was an example of Brutal Ineffectiveness. So is Elon Musk’s Twitter/X, in a different way. The board of Uber made the determination that Travis Kalanick’s behavior was both brutal and ineffective and removed him. The Jim Crow South, apartheid South Africa, and Putin’s Russia are other examples of Brutal Ineffectiveness.

Brutal Ineffectiveness is worst soonest for the people harmed, but in the end, it’s bad for everyone. If we take the long view, everyone has a practical interest in changing these systems, even the people who benefit from them in the short term.

Sometimes Brutal Ineffectiveness springs from an evil leader, but it often springs from management systems that fail to hold people accountable for bad behavior or that even reward bad behavior. The assholes begin to win, and the culture begins to lose.

Power dynamics, competition, poorly designed management systems, and office politics can create systemic injustice in ways that may be subtle and insidious at the outset but over time become corrosive, and often even criminal.

And, really, who cares about the leader’s intentions? We should demand the same good results from leaders who create management systems as we do from CEOs when it comes to profitability. If the systems reflect and reinforce the injustice in our society, they need to be changed. If a leader can’t figure out how to change the system, the leader must go.

The examples of Brutal Ineffectiveness don’t have to be as dramatic as the Weinstein Company or as bloody as Stalinism.

Think about a time in your life when leaders demanded conformity and therefore hired homogeneous teams, passing over the most skilled people for promotion, touting their meritocracy while actually creating a mediocracy. And there were no consequences for bullying or harassment, so these behaviors were common, making it difficult for many to do their best work. A vicious cycle ensued.

It’s hard to understand how or why we let things get so bad that we land in Brutal Ineffectiveness. Considering discrete problems like bias, prejudice, bullying, discrimination, harassment, and physical violations can’t explain it.

To understand, we need to consider the dynamics between these attitudes and behaviors, and the vicious cycles such dynamics can set in place. How does bias lead to discrimination? To harassment? To physical violations? And are there times when a vicious cycle ensues?

For example, a man believes women don’t handle stress well, so he discriminates against women, offering them lower-paying jobs. Having less power in the office, a woman is more vulnerable to being sexually harassed. Not surprisingly, she seems stressed, reinforcing his bias.

What can we do to disrupt these dynamics and set in place a virtuous cycle that leads to systemic justice, instead of a vicious cycle that creates systemic injustice? What moves us away from collaboration and respect? Partly it’s the discrete attitudes and behaviors already discussed. But it’s also the dynamics between them.

Let’s look at how coercion and conformity move us away from collaboration and respect.


Be One of Us or Make Way for Us: The Harmful Effects of Conformity Culture

The Conformity Dynamic drags us away from respecting individuality, usually offering a pretense of being rational, civilized, polite. But this dynamic excludes some people in a way that is not at all rational and can cause as much or even more harm in the long run as outright violence.

The Conformity Dynamic implicitly conveys an ancient message: Be one of us, or make way for us. And for many employees, of course, conforming to that “us” is not desirable or even possible.

There are many things about myself I don’t want to change—my gender, for example; and others I couldn’t change even if I wanted to, like my age or my height. And when people are excluded from opportunity or subjected to unjust policies because they can’t or won’t conform with an arbitrary norm, it leaves them vulnerable to abuse, both emotional and physical.


The Conformity Dynamic often masquerades as “polite” or “professional.” This is BS. The fact that it’s not overtly violent doesn’t mean that it isn’t destructive.

Shortly after I joined Google, a colleague told me not to wear a pink sweater to a meeting with the executives. The basic message, offered in the guise of helpful advice, was this: Try not to look too much like a woman in this meeting. He thought he was being helpful, but he was reinforcing gender bias. A white colleague told a Black colleague to cut off his locs before an important meeting. The basic message, offered in the guise of helpful advice, was this: Try not to look too Black in this meeting.

If either of these people had wanted to be truly helpful, they would at the very least have acknowledged that in a more just world, they would have offered feedback to the leaders in these meetings to focus on the real work and not the sweaters or hair of their employees.

The Conformity Dynamic is reflected in the polite racism that Martin Luther King Jr. decried in his liberal allies in the North. People use the absence of explicit violence in their behavior to deny the harm that their attitudes and behaviors cause, to ignore the systemic injustice that results. The Conformity Dynamic plays out in different ways for different people. Here is a story about how it played out in my life.

The Conformity Dynamic: The Effects of Erasure


When I was 7 years old, my parents were playing tennis at their club as I amused myself by picking wild blackberries along the fence. Suddenly, two men approached the court. I was nervous because I knew the club’s rules. Women were not allowed to be members; my mother and I were there as my father’s guests.

This translated to the following hierarchy for the tennis courts: If two women were playing, a man and a woman could take their court. Once the man and woman started playing, if two men walked up, the men could boot the man and the woman off the court. This, I feared, was about to happen to my parents.

But then my mother, who was seven months pregnant, pointed at her belly and said to the two men, “I have a man-child inside of me. So there are two men on the court.” The two men accepted this logic and went off to find another court.

I was astonished. My embryonic brother’s penis had carried the day in a way that my brilliant, creative, strong adult mother could not have. I was outraged by the injustice of it. At school, we would never have been allowed to invent such ridiculous rules to exclude kids we didn’t want to play with. But this was the sexist hierarchy that governed our existence.

When I got my first summer job at a bank in Memphis, an executive said to me, “Why, I didn’t know they let us hire pretty girls!” I was eighteen, and I had no idea what an “I” statement was or how to respond. So I said nothing. I just felt deflated.

This kind of erasure wore down all but the toughest women. And while I was getting underestimated as a result of my gender, I was getting overestimated as a result of my race. I was in denial about both dynamics for much of my life.

It was conversations with women who weren’t white that helped me notice what was really going on. This speaks to the importance of solidarity between people of different identities to challenge all the behaviors that contribute to a vicious cycle. United, we can create a virtuous cycle.

The Coercion Dynamic: How Words Can Reflect and Reinforce Patterns of Violence

The Coercion Dynamic drags us away from collaboration and makes no pretense at being polite. It is brutal.

The Coercion Dynamic, which happens when people use their power to coerce others rather than creating a collaborative environment, is an equally ancient, well-worn path that leads from bias to bullying to harassment to violence. If you aren’t aware of that path from bias to violence, you might give unconscious bias a “pass.”


But because bias can give way to violence, acknowledging that it matters is important, and we must take bias seriously. My lived experience of the Coercion Dynamic has been of a privileged sort. I have rarely had to fear for my physical safety. But here is a story that illuminates why it’s vital to recognize it, not to deny it.

The Coercion Dynamic at the Holiday Party


I went to a holiday party a few months into a new job. The company’s employees were predominantly (over 70%) men, so just walking in the door, I was a little intimidated. I was greeted by women, mostly naked, dancing in cages. That didn’t help. As I did too often in my career, I tried to ignore what was happening around me. Women dancing in cages? Someone’s terrible idea of a joke, I reasoned. I tried to ignore how uncomfortable I felt.

I looked around for a familiar face. A colleague, Simon, was headed my way. He handed me a beer. At first, I was glad to see him. Then Simon ruined everything by asking, “Do you know what a Southern girl’s mating call is?” I said I didn’t want to know, but Simon told me anyway: “Y’all, I’m so drunk.”

I didn’t feel physically threatened by Simon, exactly, but this brief exchange tripped all my sensors. The context of the party mattered—predominantly men. At college, at business school, and throughout my career, I’d been in male-dominated environments. I’d had enough good experiences to know that 99 out of 100 men posed me no harm. And I’d had enough bad experiences to intuit that one out of a hundred would sexually assault me in some way if he got a chance. I just didn’t know who that one man was. I didn’t think it was Simon.

But at the very least, Simon was signaling that he was not an upstander. He was reminding me—even if he didn’t realize it—that it would not be wise for me to let my guard down that evening.

If we lived in a world where the Coercion Dynamic did not create a well-worn path from bias to sexual violence, his behavior would have been “only” bias. He was unconscious of the implications of what he was saying. He didn’t mean it. A discrete event.

But given the world we did live in, he was reflecting and reinforcing rape culture. Even if he wasn’t aware of it, ignorance was no excuse.

Discrete Incidents Vs. Dynamics


It’s important to understand the difference between a discrete incident and an incident that is a part of a dynamic that leads from bias to violence and contributes to systemic injustice. A discrete incident is bad but is far less threatening than the dynamic that carries with it the threat or past experience of violence.

A man in tech can experience gender bias, but not sexism or misogyny. Sexism describes the dynamic between gender bias and discrimination, and misogyny describes the dynamic between gender bias and violence. A white person in the United States can certainly experience racial bias, but not racism. Racism describes the dynamic between racial bias and both discrimination and violence.

When I hazed my colleague Russ during the podcast recording, saying that he was “born doing the power pose,” he experienced a discrete incident of bullying. He was in no way concerned that my behavior, while admittedly bad, posed any threat to his physical safety, nor did this incident trigger past experiences he’d had where a woman’s bullying of him became violent—because he hadn’t had any such experiences, nor had anyone he knew.

My behavior was not part of a pattern in which women committed acts of violence against men. It did not play into the Coercion Dynamic, that well-worn slippery slope from bias to violence. It was bullying, but it wasn’t misogyny (the dynamic that leads from bias to violence against women) or misandry (a theoretical but rarely seen dynamic that leads from bias to violence against men).

However, when Simon told me the rape joke, it was both bias and misogyny. I felt a menacing undercurrent. Simon himself wasn’t overtly threatening me, but he was normalizing a sinister, criminal notion—that people think that having sex with someone too drunk to give consent is just a “party foul.”

Whether he intended to or not, he was reminding me that I wasn’t physically safe—especially if I had a drink. I’m not saying intentions don’t matter. At the same time, impact also matters.

I don’t think it’s too much to expect Simon to be aware of this dynamic or to hold him accountable for not playing into it. I knew Simon well enough to be pretty sure he did not think of himself as a person who would rape a woman or condone rape.

However, if he wanted to show up to others as the kind of person he envisioned himself to be, he needed to understand the context in which he was making this joke and the impact it had. If we are going to cultivate Radical Respect, we must be aware of the dynamics that can lead us from bias to discrimination to abuse or from bias to bullying to violence.

Even if we ourselves have never committed an act of violence and don’t think of ourselves as the kind of people who ever would, we need to be willing to notice the ways our words can reflect and reinforce patterns of violence.

Beyond Command and Control: The Collaboration Hierarchy

The work environment at Google during my time there was no accident. SVP for business operations Shona Brown had optimized its organizational design to maximize effectiveness and innovation.

Shona believed that in the modern economy, command-and-control management just doesn’t work that well. Bureaucracy is inefficient and kills innovation. Her insight was that top-down leadership, where worker bees are told what to do and how to think, stifles productivity and creativity. But you still need hierarchy.

Early on, Google’s founders had experimented with getting rid of managers altogether. That didn’t work. Shona’s insight was that while dominance hierarchies are bad for innovation, a collaboration hierarchy can work.


There was still an organizational chart with a CEO, VPs, directors, managers, and so on. But in this model, leaders at all levels were subject to real checks and balances that were baked into the company’s management systems, processes, and organizational design.

The idea was to strip managers of traditional sources of power, such as hiring, promotion, and salary decisions. This authority was given instead to teams, which were likelier to make better decisions. No leader at the company, not even the CEO, could hire people without putting them through a hiring process or promote people without putting them through a promotion process.

Managers couldn’t just pay bonuses or decide salaries unilaterally. Nobody could coerce employees to do something they didn’t want to do. I’ll never forget watching an argument between one of the three most senior leaders at the company and a group of engineers working on a project.

The executive proposed one approach. The team had a different idea. The executive couldn’t convince them, so he suggested taking three or four of the hundreds of engineers working on the project to do a small Skunk Works proof of concept for his idea. The team demurred.

“If this were an ordinary company, I’d make you all do it my way!” exclaimed the executive. “I just want to try this idea out.”

The team explained again why the executive’s idea wouldn’t work and why it would be disruptive to have even three or four people pursuing it. He allowed himself to be overruled. This kind of behavior requires a high level of trust going both ways. That’s what a good system does: it allows trust to thrive.


Across the board, processes at Google optimized for collaboration and discouraged coercion. When performance reviews came around, managers were rated by their employees as well as vice versa.

When people did behave badly at Google, they usually got extremely quick and clear feedback from their peers and their manager. And when the person behaving badly was the manager? Even before the manager’s boss found out about and corrected this behavior, team members would abandon the manager.

Google made it easy for employees to switch teams without their manager’s approval. Having a bully for a boss was an asshole tax that Google felt nobody should have to pay.

The purpose of the management hierarchy was twofold: one, to ensure accountability; two, to provide a coaching and mentoring service to help employees grow. Managers were held accountable, but they were not given much “control” to get things done.

Building Better Relationships Beats Command and Control


Managers had to rely on building real relationships with each of their employees and on inspiring or persuading people to get things done. The management structures at Google discouraged a command-and-control, “tell people what to do” kind of leadership.

In fact, using managerial authority to coerce others without allowing them to challenge you was one of the few ways a manager could get fired. Instead, everyone at Google was expected to work collaboratively, and ideas came from any and all directions.

A workplace that optimizes for collaboration and honors individuality is something you have to strive to achieve and maintain monthly, weekly, even daily, hourly. Think of your workplace as being at the top of a steep hill. You have a spectacular view, but you have to climb that hill every day to enjoy it.


Or think of it as a building. If you hire good engineers and workers, use quality materials, and build a strong foundation, your building will last longer than if you don’t. But even a well-made building can quickly become uninhabitable if you don’t clean and maintain it.

Life is change. If you don’t revisit and buttress the safeguards in place to make sure that coercion and conformity aren’t creeping into the way people work together, then workplace injustice and the inefficiency that accompanies it will take over your culture.

The aspects of human nature we are least proud of will always be pulling us away from efforts to collaborate and toward the instinct to coerce; away from respecting individuality and toward demanding conformity. Daily attention is needed to resist these forces and keep your workplace just.

This post was adapted from Radical Respect: How to Work Together Better. The book is now available as part of the Spotify Premium audiobook catalog and as a LIT Videobook.

 

Dinner and a Show: Where to Eat Near Philly Theaters


With over two dozen professional stagehouses across the region, Philadelphia ranks as one of the country’s great live theater towns.

And when you wish to turn your excursion to a dramatic play, rousing musical, or laugh-out-loud comedy into a full night out of dinner and a show, each and every Philly theater offers a bevy of restaurants, bistros, taverns, cantinas, and gastropubs a short walk away for grabbing a meal and a drink before the curtain goes up.

Whether you’re planning a special pre-show evening at a fine dining establishment or grabbing beers and bites at a local brewery ahead of showtime, here are over 70 spots near your theater destination.

For unbeatable deals to shows at nearly two dozen area theaters, be sure to score your tickets from TKTS, now available in Philadelphia.