The End of Work, or the Beginning of Radical Efficiency?

Originally posted in Enlighted blog 4/4/16

First predictions around the emergence of new technologies are strange – they can be way off base but mysteriously contain an impossibly prescient grain of truth. Often it’s simply a matter of our getting the timing wrong. Take the “paperless office” idea that surfaced when the internet first started. The initial impact of advanced electronic communication was that paper consumption actually increased, peaking around 2008, then began to decline, reaching pre-1994 levels by 2014. Now paper-based communication seems increasingly a relic of the past, and for good reason. So the prediction was mostly accurate, but off by a few decades.

Lately memes about the End of Work are firing in the synapses of the global brain, and not for the first time in history. For instance, electrification in the early 20th century unleashed a flood of new labor saving devices that promised sweet salvation from soul-crushing labor to one sector of society – housewives. What happened instead was simply that some labor was transferred to machines and domestic labor was rather more redefined than utterly transformed. The “science” of home economics was duly invented to help homemakers gain efficiency in their efforts, but ultimately someone still had to organize the clothes and dishes and run the vacuum cleaner. Disillusionment with this state of affairs contributed greatly to women entering the workforce en masse and seeking equal rights and pay, a process that began in the first few decades of the 20th century and continues today.

So whether or not there is any truth to a vision of the future where we evolve to become like the obese, helpless slug people in Pixar’s WALL-E remains to be seen, and one could fret about the long term effects of sitting in front of a laptop all day (like I’m doing now), especially when you consider that our bodies evolved over millions of years in the savannah to do physical work, hunting, gathering, and escaping predators. And we’re not the only species that requires physical work for psychic stability. Faced with widespread unemployment, elephants in Burma evidently suffer stress, loss of morale, and weight gain just like we do.

Those of us who think deeply about and create the built environment have a lot to contemplate these days as the fundamental nature of work changes. We’re tasked with understanding how the environment affects and is affected by things like productivity, health and wellness, creativity, collaboration, and the organization of corporate structures. As we transition into what Jeremy Rifkin calls the Third Industrial Revolution, we’re seeing a persistent slowdown in global GDP, and a flat curve on aggregate efficiency, or the ratio of useful work extracted to the total energy expended. From an economic perspective, part of the problem is in the metric of GDP itself, which was created to measure industrial and agricultural output, and not the useful product of communications technologies created and proliferated by companies like Google, Facebook, and Twitter, most of which doesn’t cost the consumer anything. Despite many proposals, we haven’t yet come up with a suitable replacement for GDP.
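Aggregate efficiency reduces to a simple ratio. A minimal sketch – the formula follows the definition above, but the input numbers here are purely illustrative:

```python
def aggregate_efficiency(useful_work, energy_input):
    """Fraction of the energy expended that ends up as useful work."""
    return useful_work / energy_input

# Illustrative only: an economy that expends 100 units of energy to
# extract 15 units of useful work has 15% aggregate efficiency.
print(aggregate_efficiency(15, 100))  # 0.15
```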

One perspective is to examine the mental models of what type of person has an evolutionary advantage in the working world over the last 100 years or so. Louis Menand, in his New Yorker piece The Efficiency Trap, provides a brief history of the evolution of self-improvement literature, pointing out that we fool ourselves by equating personal growth with the acquisition of characteristics that assure our suitability for work alone rather than for life in general. In the first industrial revolution, where physical strength and agility were still important attributes, one’s output was easily measured by how much physical product got produced per unit of labor, and the personality traits best suited for work were persistence, stamina, and the ability to habituate to highly repetitive tasks. In the second industrial revolution – the transition to the service economy – workers who were eminently likeable and could communicate with customers and clients and win lucrative contracts were highly valued. In the third industrial revolution – today’s information economy – customer relations are outsourced and employers want people who are “hybrid” animals, with equally high IQ and EQ. According to Australian entrepreneur Michael Priddis, “Computers are good at the jobs we find hard, and bad at the jobs we find easy.”

Today the ability to interpret data and make good use of it is a crucial capability, and many new job descriptions contain the word “data” – data scientist, data analyst, even Chief Data Officer. This includes the ability to gather data about the workplace and use it to constantly improve the work environment, an activity increasingly accruing to the HR function. And the HR function is being transformed dramatically as well: in the information economy, corporations are increasingly focused on their “human” resources rather than their physical plants. They are also increasingly focused on the physical plant as it affects the humans in it. The new role of Chief Performance Officer could rely on emotional data from employees, collected persistently and analyzed to measure the underlying drivers of efficiency and productivity.

With increased mobility and employee turnover, corporations have a considerable incentive to make employees feel that they can take ownership of the workplace, including control of comfort factors and health and wellbeing. A powerful way to do this is to give them increasing access to data about the workplace and the behavior and interactions of the people in it. This is not an easy task to be sure, especially considering privacy concerns, but as hierarchies flatten in business structures and decision making becomes increasingly decentralized, we’ll get used to sacrificing privacy for control and flexibility.

Mr. Rifkin points out that because much of the world has not yet transitioned out of second industrial revolution models, aggregate efficiency has stagnated at about 15% globally for many years. But he theorizes that as we evolve further into the third industrial revolution, we’ll reach levels approaching 40%. This of course includes radical automation of many current job functions, and many if not most of these will not immediately be replaced with new jobs. Robots will eat many jobs, partly because they can, and partly because they should. The platform for this efficiency increase will be IoT, which, true to historical patterns of connective innovation, brings together advances in communications, energy, and transport.

What will this kind of radical efficiency mean to how we work? It’s difficult to fully imagine, given the unprecedented levels of technological, economic, and environmental transformation we’re dealing with now and will deal with in the near future, but it seems clear that we need to stay within our physiological limits, and that we need to continue to understand them better. Our bodies did not evolve optimally to operate punch presses, drive cars, or sit at desks in offices under bad lighting all day, so whatever paradigm of work we create for ourselves carries a cost in physical and emotional stress. New kinds of scientific inquiry are emerging and beginning to be applied to behavior as it relates to the built environment, including embodied cognition – the idea that our environment influences how we think and behave. We’ll need new ways of interpreting all the data we’re beginning to collect, and new theoretical models to help us understand it. It’s increasingly risky to plan and operate workspaces based on models that evolved under previous economic conditions. Fortunately we’re seeing tremendous opportunity to design smarter, more flexible, healthier workplaces with a vast array of new tools.

The New Productivity: How the Office Environment Affects Output

Many new analytic tools promise to improve productivity, but productivity in today’s office environment is persistently difficult to measure and manage, as it has been for decades. As the global economy becomes increasingly complex, many regional and local economies are still transitioning from industrial production where productivity (output) is easily measured in units per input (energy, labor, and materials) to an information economy where useful metrics are much harder to define and employ. As it does in almost everything these days, technology plays a pivotal role in this process, and even though it’s creating a lot of the problems, it’s also offering new solutions, including sensor and analytic systems, better adaptive design techniques, and new metrics.

One of the bright spots in using data to make better office environments is our improved ability to measure and correlate behavior with results. Behavioral research in the built environment is not a particularly large or robust field, and so far has relied heavily upon survey methodology to gather useful data. So almost all our learning is based on post-occupancy studies conducted by people asking questions from questionnaires, or even less effectively, online forms with multiple-choice buttons to click. People self-report about their behavior, and this has repeatedly been demonstrated to be a rather inaccurate method, to say the least. For instance, in one Enlighted installation, building occupants who were surveyed reported a very different level of conference room use than the measured data eventually showed. Embedded analytic systems enable companies to, in a sense, conduct persistent research, eliminating the need for expensive after-the-fact studies and surveys that often produce no actionable results, or worse – inaccurate data and conclusions.
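As a sketch of the kind of discrepancy described above – every number here is hypothetical, not from any actual installation – comparing self-reported conference room use against sensor-logged occupancy:

```python
# Hypothetical fractions of working hours each conference room is in use:
# what occupants reported in a survey vs. what embedded sensors logged.
surveyed = {"Room A": 0.80, "Room B": 0.60, "Room C": 0.70}
measured = {"Room A": 0.35, "Room B": 0.55, "Room C": 0.20}

for room in surveyed:
    gap = surveyed[room] - measured[room]
    print(f"{room}: reported {surveyed[room]:.0%}, "
          f"measured {measured[room]:.0%}, gap {gap:+.0%}")
```

The point of persistent measurement is that the right-hand column comes for free, continuously, instead of from a one-off study.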

This new capability has enormous potential to transform how we build office space, because it solves one basic behavioral problem with metrics: we focus on things like quantity of output and self-reported occupant “satisfaction” simply because the metrics are relatively easy to use and not because they’re particularly relevant as evidence. For instance, organizations that even take the initiative to do post-occupancy evaluations of building projects often do them just to get a LEED point, and the data is rarely shared or used to inform new design. Fortunately, new metrics that focus on actual, persistently measured and analyzed behavior are emerging that will prove to be far more useful in designing and managing work spaces that are much better adapted to how people work today. I’ve written about some of these here and here, and more will develop as we explore the new capabilities of measuring behavior and using what we learn to make better buildings.

Even simply measuring how people move through spaces during the day can uncover valuable patterns, previously hidden in plain sight, as we can now see a visual realtime map of interpersonal interaction, which by most accounts is one of the key indicators of workplace satisfaction. As online communication becomes so ubiquitous as to begin to seriously impair concentration (one of the previously mentioned problems exacerbated by technology – I’ll get back to this a bit later), we begin to focus on how important face-to-face interaction is, especially casual or accidental encounters like those we experience in good walkable city neighborhoods. It almost seems to be the entire reason for having offices in the first place. I don’t know about you, but those pictures of people with laptops working on their lounge chairs on the beach with a cold one in hand don’t make any sense to me. I’ve tried it, and I can’t really work in settings like that; I just want to get out on the water. Despite the pervasive techno fantasies of omni-locational work, there are real limits to where we can actually get anything constructive done that resembles actual work in some tangible way.

Leading architecture firms like Gensler and Perkins + Will have been researching the impacts of office design on productivity for many years. Gensler in this 2013 report summarizes some of the driving design ideas nicely: provide effective focus space; collaborate without sacrificing focus; and drive innovation through choice. Providing choice – the ability to control one’s environment – has been a pervasive theme in post-occupancy surveys and points to important user-centric design features like operable windows, granular zone control of lighting and HVAC, tunable, dimmable mobile desk lamps, and even smart chairs like the HyperChair. Of course, even the illusion of control can have the same effect on occupant satisfaction, as evidenced by things like dummy thermostats and elevator buttons. It’s like adjustable shelves in furniture – everyone wants them but no one ever changes them once they’re set up.

But, as reported by Ben Waber, Jennifer Magnolfi, and Greg Lindsay in this excellent article in Harvard Business Review, the challenges for office design today are manifold. The factors impacting the process are overwhelmingly complex: globalization; an increasingly mobile workforce; vastly different upgrade cycles of buildings versus systems; overcommunication; rapid boom and bust cycles for tech companies; the increasing cost of prime urban real estate; telecommuting; and a long list of others. In their article, Waber, Magnolfi, and Lindsay show fascinating examples of how different companies got widely varying results by following similar strategies – essentially tweaking the balance between collaboration and focus. Again, choice emerges here as an overriding factor in successful design strategies. But perhaps the biggest design challenge of all today is determining the appropriate level of choice. As culture and technology become increasingly complex at an increasing rate, choice overload is inevitable and often leads to cognitive shutdown, as shown by Barry Schwartz in The Paradox of Choice. This applies especially to UI design for building systems, as I touched on in this recent blog.

On the theme of human interaction, a compelling finding by Waber, Magnolfi, and Lindsay is that the Allen Curve holds for online communication. According to the authors, the Allen Curve, postulated in 1977 “estimates that we are four times as likely to communicate regularly with someone sitting six feet away from us as with someone 60 feet away, and that we almost never communicate with colleagues on separate floors or in separate buildings.” It turns out that people in close proximity in office spaces are significantly more likely to communicate and collaborate online as well as face-to-face.
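The quoted figures can be turned into a toy model. A power-law falloff is my own modeling assumption here, not Allen’s functional form, but the 4x-at-6-vs-60-feet ratio is enough to pin down its exponent:

```python
import math

# Assume the probability of regular communication falls off as
# distance**(-k). The quoted ratio (4x more likely at 6 ft than at
# 60 ft) determines the exponent k.
k = math.log(4) / math.log(60 / 6)  # ≈ 0.602

def relative_likelihood(near_ft, far_ft):
    """How many times more likely regular communication is at near_ft
    than at far_ft, under the power-law assumption."""
    return (far_ft / near_ft) ** k

print(relative_likelihood(6, 60))   # recovers the quoted 4x ratio
print(relative_likelihood(6, 200))  # a colleague across a large floor
```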

There are two unanticipated consequences of the rapid proliferation of online communications that seem to be particularly disruptive (not in a good way): overcommunication and the apparent decline of personal interaction. We are all familiar with the insufferable barrage of emails and texts at work that follow us home and to the beach and to dinner and to our daughter’s birthday party. Most of us feel that we devote way too much time to managing information rather than doing focused work. And because a lot of less critical information can (and should) be communicated online, we find meeting in person and talking on the phone to be increasingly disruptive and unnecessary. But then we’re kind of starved for human contact in a sense, and we seem to have endless meetings about nothing and don’t feel like we’re getting anything accomplished. How the design of offices can change these feelings of uselessness and impact the changing nature of work varies greatly according to the company and its individual goals and workforce, as shown by Waber, Magnolfi, and Lindsay’s examples.

Building science in the past has focused largely on physics: R values, structure, materials, energy, behavior of water and air. Only recently have we made strides in understanding analytically how humans and buildings interact, not just the humans who occupy buildings, but the ones who plan, build and manage them. The emerging analytic tools that are now becoming available will help us greatly in learning to make offices and other building types more adaptive, efficient, sustainable and smarter. 

Persistent Insights: Embedded Analytics Redefine Evidence Based Design

Originally posted in Enlighted blog 3/10/16

A promising and surprising trend uncovered by embedded sensor and analytics systems like Enlighted’s Space application is that previously unquestioned assumptions about how people use buildings often begin to crumble in the face of granular realtime data and the insights it provides. Space is only one of many current types of applications impacting the design and planning of facilities – there are dozens more potential ones on the near horizon. And the impact of these systems extends beyond improving the end results of design – it can dramatically redefine the most important part of design, the front-end research.

Industrial behavioral research as we know it today began with the work of Frederick Taylor, whose time and motion studies were among the first “scientific” methods aimed at counteracting the unintended consequences of ramped-up industrial production made possible by electrification. One result of the standardization and rapidly increasing mechanization of industrial work was that people whose jobs used to be diverse and interactive suddenly found themselves with narrow, highly specialized jobs, standing or sitting in front of deafening single-function machines, pulling a single lever for ten hours a day as identical widgets rolled by on the assembly line. The unforeseen social consequences of mechanization – devastatingly high labor turnover, drops in productivity, and low morale – became dealbreakers for the economy. Since behavioral science itself was fairly new and unproven, it was natural that corporations assumed that what worked for machines – rational, cause-and-effect physics – would work for people too. This viewpoint essentially turned people into machines and did not go over well, to say the least. One result of the violent reaction of the workforce was what we now recognize as corporate welfare, which ultimately became our current health care system, as corporations made more and more concessions to workers in an effort to lower the turnover rate.

Today behavioral and cognitive science have made much progress, but have not yet really begun to realize their potential to address crucial problems we now face in the workplace. What we call “building science” is applied to more easily quantifiable physical things – the energy use, materials, systems, and configurations of the buildings themselves, not the people who live, work, and play in them. And few corporations can afford more rigorous research like that conducted by William Whyte in the 1970s, especially for indoor environments, where it’s difficult to unobtrusively observe how people work and socialize in the office environment. Who wants to work with someone standing around all day with a clicker counting people and making notes on the frequency of their trips to the water cooler? Fortunately most of the repetitive, quantitative work of this kind of data gathering has been highly automated and has become much more accurate, efficient, and affordable. The problems of attracting and motivating workers and stemming turnover are similar to those of a century ago, but facilities planners still don’t have the data and deep understanding they need to tackle productivity. Millennials evidently want bean bag chairs, cuddle rooms, sushi bars, espresso machines, foosball tables, and a balance between social and private space, but how much, where, and most importantly, why?

Behavioral research is a critical part of design and planning but is not always particularly valued in our business culture today. In the past, research meant squadrons of people in white lab coats with clipboards, asking questions and making observations. Research projects were slow and expensive, and results were often misunderstood, simply ignored, or worse, based on bad science in the first place. This is one of the reasons why there’s not much in the way of useful behavioral research in the built environment, and what’s available is focused almost exclusively on energy efficiency.

One problem with built environment research is an over-reliance on techniques like post-occupancy building surveys, which are typically done with the crudest tools available and characteristically fail to deliver rich insights into crucial behavioral questions. According to this article, such surveys suffer from a range of design problems – selection bias, small sample sizes, and low accuracy – and used alone, which is often the case, are simply inadequate for drawing conclusions about building performance. A key reason for this is that they rely on self-reporting and conscious choices rather than independently observed, unconscious behavior. Because of cost and time constraints, these surveys are typically done over the web, and less often in interview situations, where unskilled interviewers routinely, if subconsciously, influence results.

What to do with all the new smart analytic capability that’s so easy to implement? Let me count some of the ways. For starters, I think we need to forget about the typical way we practice “research” in the built environment today, a proposition with many risks and unclear rewards. It involves spending a lot of time and money on experts, trying to prove one thing and more often than not accidentally stumbling on something else that’s much more useful. The combination of powerful data gathering systems and analytics can transform the process of learning to make better buildings and free us to focus on the real problems, which are behavioral, not technical.

What I’m going to christen here as the fabulous new Paradigm of Persistent Insights means that with a rich sensor network enabled with a wide range of input devices, crucial behavioral, emotional, and experiential data can be automatically aggregated and analyzed for whatever investigations are important – we can focus on the insights we need rather than the mechanism of collecting the data that allows us to uncover them. In fact, within certain parameters, built spaces can be seen almost as permanent experiments. In many cases there’s no need to devote resources to replicating a retail, office, industrial or institutional environment just so that you can study it closely – you can change things on the fly and observe many direct and interactive effects, simultaneously if desired.

The first obvious example is lighting, which is visible, increasingly digital, easily controlled, and of course the backbone system for many sensors. We’re constantly increasing our understanding of how lighting impacts things like productivity, learning, buying behaviors, and health outcomes. Any retail space equipped with dimmable and color-tunable lighting and a sensor-based analytic system can get immediate feedback on the effects of changing lighting on foot traffic and sales – this is directly observable data. This applies as well to offices wanting to test the effects of lighting on productivity, or to hospitals needing data to document the effects of lighting on recovery times, patient complaints, or nurse fatigue, for instance.
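A minimal sketch of that feedback loop – all counts here are invented for illustration – comparing sensor-logged foot traffic under two lighting settings in the same space:

```python
from statistics import mean

# Hypothetical hourly foot-traffic counts logged by occupancy sensors
# under two color-temperature settings in the same retail space.
warm_3000k = [112, 98, 121, 105, 110, 117]
cool_5000k = [96, 88, 102, 91, 99, 94]

lift = (mean(warm_3000k) - mean(cool_5000k)) / mean(cool_5000k)
print(f"Traffic lift under 3000K lighting: {lift:.1%}")
```

A real deployment would of course want longer runs and a significance test before acting on the difference; the point is only that the data arrives continuously and for free.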

Or consider the movement of people through a building, which is now highly mappable with Space. With the new flexible open plan (or not) systems, workplaces are highly configurable. What happens to human circulation patterns, social interaction, and productivity under different space configurations and adjacencies of amenities? Then consider the interactive effects of lighting, thermal comfort, and circulation. The possibilities are endless, especially when captured data on existing spaces is used to inform the design of new ones. Certain patterns inevitably emerge that many designers recognize intuitively but, until now, couldn’t support with quantifiable data.

What will provide some boundary lines, of course, are limits on companies’ ability and willingness to tinker with their workers like so many guinea pigs and to risk downtime as “experiments” are configured and complex, intersecting insight parameters determined. But with analytic systems in place and relatively feasible to implement, we can now focus on the important questions that drive business, like employee health and wellbeing and customer experience.

Even if you only want to focus on energy in buildings, as most people do, and don’t want to go into the murky, scary world of behavior, emotions, and experience, you really can’t avoid it. We have learned a great deal about building energy use and how to build much more efficient buildings, but we’re not doing it at the scale we think we need to, and our failure to do so is not based on a lack of technical solutions or advanced materials or software – it’s because we don’t understand our own behavior very well. Why do architects persist in making glass buildings and glorifying this aesthetic in the face of overwhelming evidence of their dismal thermal efficiency? Why do people not use window shades when doing so would save large amounts of energy? Why do people buy more vegetables under certain types of lighting conditions than others? When people claim to be “mostly happy” with their work environment but consistently leave a company after an average of six months, how much of a factor is the workplace, and how do we know? These are the kinds of questions we can begin to answer with a better approach to generating evidence-based insights. With embedded analytic systems, companies can begin to answer them with real data and make better design decisions.

The Envelope Problem

Originally posted 2/29/16 on ENLIGHTED

Building scientists, architects, and engineers think of building enclosure in a very fundamental way as the “envelope” – the physical separator between the conditioned and unconditioned environment of a building, including its resistance to air, water, heat, light, and noise transfer. Economically and culturally the building industry is roughly divided into two camps – what I call “outside the envelope,” mostly publicly owned, and “inside the envelope,” which is mostly privately owned. The “outside” ecosystem includes mostly public infrastructure like roads and streets, sewers, bridges, parks, and the power grid; while the “inside” ecosystem is mostly private buildings – offices, stores, houses, institutions, factories. Each ecosystem has its own group of owners, investors, and consultants who specialize in planning, funding, designing, building, and operating what gets built or rebuilt. One thing of rapidly increasing importance in the built environment today that is not contained by building envelopes is data.

The common connection between these two ecosystems is the electrical power grid, which is like the central nervous system for civilization. Information and energy (and especially information about energy) are increasingly flowing along the same networks. But because responsibility for each belongs largely to different groups, there are significant barriers to sharing applications and data across the boundary between private buildings and public cities.

A Fundamental Application Break
In order for networks and data in fundamentally different realms to exchange data, they need a mediating mechanism, like a translator between two people who speak different languages. But so far these are hard to find because of legal, economic, and security issues. According to Jay Shuler, president of Shuler Associates and a Smart City/IOT expert:  “It’s unusual for an application to address both environments, or for data to be shared between indoor and outdoor applications, but that is what is needed. And there’s a lot of ambiguity about who owns public data gathered by private agencies working for public entities, like when smart streetlights monitor crowd movement patterns, but that kind of data needs to be shared. It’s very easy for decisions that should be made rationally to devolve into emotion, politics and self interest.”

One such decision is now unfolding in a very public, painful, and portentous way with the recent conflict between the FBI and Apple over breaking the code on the iPhone of one of the perpetrators of the massacre in San Bernardino in December. Cases like this, where privacy and security run smack up against grave issues of national security, are bound to surface more frequently as buildings, cars, devices, streets, light fixtures, and of course, people throw off more and more data into the ether. Questions about who owns the data and when to employ deep granularity to any data set will only increase the more we employ sensor networks and advanced analytics.

Technical Issues and Tradeoffs
It seems that most of the technical issues people talk about in this area revolve around communications protocols – Zigbee vs Wi-Fi vs Bluetooth, etc. – and of course security. These issues overlap – VLC (Visible Light Communication, or Li-Fi), for instance, is being touted as more secure because it works only on line of sight and can’t pass through walls – and will eventually be resolved. Part of the reason they’re so much the topic of discussion is that tech companies in the network, software, and hardware sectors know that determining and owning a standard greatly facilitates the rapid accumulation of wealth. But given the pace of innovation, which standards might eventually win seems increasingly difficult to project, and I think we should be seeing it more in terms of combined standards and protocols – mostly, the more open source the better. Li-Fi, for instance, although it shows great promise because it greatly expands speed and available bandwidth, is uni-directional and not yet ready for commercialization. It may only take off when systems are developed that combine it with other protocols offering fast upload speeds.

There are other technical issues that don’t seem to be discussed quite as much, specifically around data architecture. According to articles like this, we need a new one for IoT. The last estimate I could find of the number of photos uploaded to the cloud every day is 1.8 billion, and this has surely increased since then. And you may have noticed that we’re not all doing live video chat on our wrist TVs yet, a tech development envisioned in 1964 in the comic strip Dick Tracy. When we start doing that alone, not to mention all the other data gathering going on, the amount of data to be collected, transmitted, stored, and analyzed is pretty much incomprehensible. With this massive increase in data collection globally comes a tradeoff between storage and computation, or between gathering and analyzing realtime data vs interval data – decision points that in many cases may need to be automated. A specific example: during a terrorist attack, a masked perpetrator is captured on video walking briskly from the scene. Gait recognition software can be applied to the captured video to identify his unique gait pattern. That pattern is a far smaller data set than the thousands of hours of video from other locations where the individual might be detected – basically vector points rather than full moving bitmap images – so targeting and filtering become much more efficient computationally. There are of course unlimited other scenarios for this that aren’t related to security and terrorism, but as security remains a top concern for everyone, we need to address one issue there that should be clarified.
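The storage-versus-computation tradeoff can be made concrete with a back-of-envelope sketch. The bitrate, joint count, and sample rate below are my own illustrative assumptions, not measured figures:

```python
# Illustrative sizes only. Raw video: ~5 Mbit/s compressed stream.
video_bytes_per_hour = 5_000_000 / 8 * 3600   # ≈ 2.25 GB per hour

# A gait signature reduced to, say, 20 tracked joint positions
# (x, y as 4-byte floats), sampled 30 times per second.
gait_bytes_per_hour = 20 * 2 * 4 * 30 * 3600  # ≈ 17 MB per hour

print(f"video: {video_bytes_per_hour / 1e9:.2f} GB/h")
print(f"gait vectors: {gait_bytes_per_hour / 1e6:.1f} MB/h")
print(f"reduction: {video_bytes_per_hour / gait_bytes_per_hour:.0f}x")
```

Even with generous assumptions, searching the vector stream instead of the raw video cuts the data to be scanned by roughly two orders of magnitude.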

Disaggregation of Identity Data
Most people take it for granted that ubiquitous data collection systems will basically collect everything that happens everywhere and retrieve anything at any time, including who you were with at Aunt Suzie’s birthday party last Friday and how much they had to drink. But per the example above, disaggregating granular identity information from massive data sets is a practical technical consideration as well as an issue of social justice. Very useful information can be gathered about how people move in spaces, what their emotions or health conditions are, what they buy, and how they drive without drilling down into their individual privacy – in fact doing so on a wide scale is highly impractical. Of course, like in the Apple case, the ability to drill down to the individual and access their private data is always there somewhere, and the debate now is, as it should be, about who gets to do that. So the real important discussion is not about the technology, what it can do, and what standards it will use, it’s about how we use it to improve our lives and our society on a global scale. Whatever the outcome, the Apple vs FBI case will have wide and immediate repercussions all over the world, especially in countries like China.

Collecting and Analyzing What Matters
There are many immediate practical reasons to cross the building envelope data barrier between inside and outside, private and public. Sharing building energy use data alone represents a huge resource for improving efficiency and stimulating innovation, and the federal government has created the Energy Data Initiative to facilitate this. As more granular energy monitoring systems for use “inside the envelope” evolve that provide useful data to utilities, public agencies, and regulators, everyone stands to gain by getting a more accurate picture of how to implement efficiency. This becomes even more important as distributed generation and Smart Grid technology help cities evolve into net producers of energy.
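
A simple illustration of the realtime-versus-interval tradeoff mentioned earlier: condensing a hypothetical stream of per-second power readings into the coarser 15-minute interval data a utility or regulator might actually need. The units and interval length here are assumptions for the sketch.

```python
# Sketch (assumed units and interval length): condense per-second power
# readings (watts) into 15-minute interval energy (kWh) -- far less data
# to store and share "outside the envelope," with no loss for billing
# or efficiency analysis.

def to_interval_kwh(watts, interval_s=900):
    """Sum per-second watt readings into kWh per interval."""
    out = []
    for i in range(0, len(watts), interval_s):
        chunk = watts[i:i + interval_s]
        out.append(sum(chunk) / 3600 / 1000)  # watt-seconds -> kWh
    return out

readings = [1500.0] * 3600          # one hour at a constant 1.5 kW
print(to_interval_kwh(readings))    # four intervals of 0.375 kWh each
```

The realtime stream stays inside the building for fast control decisions; only the condensed intervals need to cross the envelope.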

Other important areas where data should be shared between inside and outside the building envelope include: health and wellness; environmental measures like pollution, biodiversity, water and air quality; traffic and parking; retail activity; pedestrian activity. We’re at the very beginning of our understanding of how these powerful tools can be used to improve the quality, sustainability, and economic vitality of our cities and buildings. We don’t experience the world as two completely different universes, public and private, indoors and out – we move freely between them constantly. We need to break through the “envelope” in how we use data in the built environment.


A Brand New Look at Lighting

On February 25, the IES San Francisco Section presented an excellent program with Angela McDonald and Faith Jewell, both of Horton Lees Brogden Lighting Design (HLB) in San Francisco. Entitled Elevating Brand Experience: Lighting Techniques, the talk uncovered some surprising aspects of lighting design.

When approached by IESSF President Marissa Tucci with a request to present a program, Angela and Faith thought it would be challenging and interesting to offer something a bit outside a typical lighting program format, on a subject that perhaps many practitioners don’t think about so often.

I was interested in the program as a marketing and branding consultant – for most of my earlier career I was involved in brand identity, research, and management and created brand programs for dozens of companies as well as designing and conducting hundreds of customer interviews. I started out on the design side and eventually realized that I’d rather be doing the most important initial work – making direct contact with real-life end users in a serious effort to determine what problems were most worth solving with design. To me this became the most important part of design. And because I couldn’t find anything else that really made sense, I created my own special theory of branding that involved five dimensions – category, character, benefits, difference, and credibility. I was quite fond of it naturally, but trying to get beyond pride of authorship, I also think there was some rigor in my methodology that I picked up from a wide variety of sources – kind of a connective innovation.

As introductions to each project, Ms. Jewell and Ms. McDonald presented a bewildering variety of definitions of “brand” and “branding” from a range of thinkers – none familiar to me except Leo Burnett – who seem to have emerged since I last took a serious look at the whole thing (probably about a decade ago). Unfortunately, any real consensus about what constitutes a brand, or rigor in researching and measuring it, seems to be as elusive now as ever, which is a bit disappointing but understandable. For me, statements like “every individual becomes a media entity” and “create your own personal brand” have diluted the operating idea of trade identity and company or product reputation that we used to work with. I was able to tease out some underlying themes though, having to do with “storytelling” (another overused and potentially meaningless conceit) and experience. Of course I do believe that we have a hardwired instinct for narrative, and that this is part of how we deal with the world cognitively, so this is probably what people are referring to when they talk about “telling a story” through a space, a garment, or an ice cream cone. And the quality of customer experience, to use the programmer-driven acronym UX, is especially crucial in the retail environment today as companies face a range of challenges, including figuring out the relationships between online and in-store sales, managing customer loyalty, and charting product roadmaps.

Ms. Jewell and Ms. McDonald have done a masterful job of digesting all the brandspeak that they have no doubt encountered with their teams on a wide variety of projects and using only what made sense to create wonderful design, and for this I heartily commend them. I’m a bit too close to it to be neutral, but none of what is to me monumental ambiguity in brandspeak seems to have gotten in the way of good design for them, indeed much of it seems to have helped. And they also demonstrated that we can greatly extend the experience and meaning of “brand” in the built environment through lighting, which is kind of amazing and like all great design, inevitably elegant once you see it.

The first example of this magic – I have to call it that – is their lighting of the Mission Street PG&E substation located at Eighth Street and Mission in San Francisco. Buildings like this are, from a New Urbanist perspective, death to the street, as they take up entire blocks or large portions of them, offer no visual relief or even windows, and tend to make pedestrian traffic wither. But historically, electrical substations are, to follow the logic of Ms. Jewell and Ms. McDonald, kind of a “brand” statement by the utilities, almost like temples to the God of Electricity erected by the organization with a kind of ultimate power over civilization – the “power” company. The statement subliminally could be “remember where your power comes from,” almost like the Egyptian monuments to the Sun God Ra. The earliest buildings for electrical generation in cities were horrendously dangerous and polluting plants powered by coal gas and, because they were DC based, could deliver power effectively only within a one-mile radius, so they were inconveniently numerous as well. Once the alternating current standard allowed the pollution of electrical generation to be exported safely outside city limits, power became clean and safe, Edison’s propaganda campaign against AC notwithstanding, and the buildings still needed within city limits could become iconic propaganda devices meant to convince people to use power and, most importantly, to buy the growing number of devices that consumed it. Hence our industrial consumer-based economy was quickly born. I may be overreaching a bit here, but in the early part of the 20th century, private utilities and companies like GE were actively involved in erecting monuments to electricity that drew heavily on classical monumental architecture and ancient deities.

The Mission Street Substation, designed in 1948 by William Merchant, is one example of such monumentalism which, according to SPUR, tends to defy human scale. And the building was not done without regard to aesthetics or some measure of civic pride. Ms. Jewell and Ms. McDonald realized this and made the most of the building’s inherent aesthetics, lighting it in a way that encompassed whatever original lighting may have been done for the building. The use of low-angle grazing light to bring out the dramatic aspect of local artist Robert B. Howard’s WPA-style bas-reliefs “Power” and “Light” is brilliant. And the designers’ restraint in forgoing typical rainbow color-changing gimmickry in favor of a more dignified approach is quite refreshing. My experience as a longtime San Francisco resident is that of suddenly seeing a rundown, underappreciated landmark shining in renewed glory, probably better than the original. Honestly, it’s a relatively modest project, but this building is one of the best single examples of the transformative power of lighting alone I’ve ever seen. It’s so successful because the designers understood the historical context and aesthetic intent of the original building and used new technology to bring out its best qualities, sending a positive “brand” message. A triumph!

Another brilliant project presented by Ms. Jewell and Ms. McDonald was the Jins Eyewear store, which addressed an important aspect of branding – how you can help a brand adapt to changing conditions and customer preferences. Jins Eyewear, a Japanese eyewear company, developed a strong brand over the years whose store interiors relied on a lot of wood that referred to the highly evolved craftsmanship of Japanese carpentry and was popular with a male audience. Eventually this became a liability as the company began to target more women. The design team devised a look for the stores that was gender neutral and still used wood finishes, in ingenious ceiling fixtures that provide task, ambient, and decorative lighting all in one while referencing traditional Japanese wood craftsmanship. The resulting experience of the stores is greatly improved while retaining their valuable brand equity.

Jins Eyewear's elegant lighting and integrated design approach delivers a superior customer experience.

It’s still hard for me to understand the function of narrative or “storytelling” in branding, because the way our brains are hardwired to work is that we seek explanations for everything whether or not they make any rational sense. We simply have to make up stories to explain our experiences, and when we can’t, we let other people (even people like Donald Trump) make them up for us, or default to an explanation of something simpler than the really important thing we’re trying to explain. This is a fascinating thing to look at, and also very difficult to measure and manage.

Experience though, is a bit different in this regard. As a branding person back in the day, I, like everyone else, constantly faced the fact that 50% of what we did worked, we just didn’t know which 50% it was. I always wanted to do better than that and so was attracted to the idea of really listening to people’s experiences, which was something design teams rarely did. Not only are teams like those of Ms. Jewell and Ms. McDonald asking questions and listening more, we’re now able to measure the unconscious parts of experience that drive behavior. This is increasingly feasible today in the built environment, especially in retail. We are increasingly able to measure customers’ movement, emotions, buying decisions, and experience with rich data gathering networks built on lighting and heavily influenced by lighting. This is changing everything about retailing, and lighting is at the center of it in many important ways.

This presentation showed me that while it remains difficult to quantify and measure, the “brand” experience as we understand it is heavily dependent on lighting in new and surprising ways – not just in commercial consumer products but with community organizations and other service based businesses as well – and that it’s impossible to separate design efforts into different disciplines. They all have to work together just like each ingredient in a great recipe prepared by a master chef must harmonize. It behooves lighting designers to know more about branded environments, and how this definition can be expanded to include improving customer experience in the built environment. It’s also important for clients and their design teams of architects, interior designers, graphic designers, and even IT people to learn about the transformative power of lighting and to integrate it fully into their design projects.



Disrupt No More: Connect!

I don’t know about you, but when we call it “disruption,” I’ve had my fill, thank you. My generation thought color TV was an amazing innovation, mainly because we could watch Walt Disney’s Wonderful World of Color on Sunday night over our TV dinners (another amazing “disruptive” innovation, though we didn’t call it that at the time). Speaking only for people my age, over the last five or six decades we’ve had to adjust to too many disruptive changes and have had to learn too many complicated new interfaces and business models over and over – we are weary! We can barely handle yet another effin’ “paradigm shift”! And some people in our parents’ generation are beyond weary, often simply giving up and refusing to acknowledge even the internet. Of course that doesn’t mean we’re miserable with all this new stuff, especially when we can Skype with the grandkids whenever we want. Call that innovation “connection” and I’ll take all I can get. New paradigm? I’m chill, dude – download the app! I suspect even younger generations become weary of change that’s too rapid and resist it on some deep level. And disruption is a very bad word when it comes to ISIS, increasing inequality, and global environmental degradation – things that technology both exacerbates and remediates. We need a theory that most of us can relate to – one that guides us in how to build and grow sustainably, without trashing the planet. We no longer need a theory that set out to describe how big companies fail and became misinterpreted to justify techno-narcissism, ruthlessness, greed, and the unprecedented rapid concentration of wealth and power in a few very large global corporations.

I’m an avid reader of the New Yorker and have also greatly enjoyed Harvard Business Review when I’ve availed myself of the rather expensive publication, usually from an airport newsstand – there’s just something about flying for business that makes me want to read business publications. So it was with some surprise that I encountered Jill Lepore’s devastating takedown of Harvard Business School professor Clayton Christensen’s disruption theory in the New Yorker in June of 2014: The Disruption Machine: What the Gospel of Innovation Gets Wrong. Lepore is one of my favorite writers, and I hold her in high esteem as a historian. She took a lot of flak for crossing invisible but definitive academic boundaries of “collegiality” by attacking someone in another field, but it’s evident in her critique that she (correctly, in my opinion) sees the theory as insufficiently grounded in historical example and in some ways as an attack on the historical approach itself. I’ve also spent plenty of time in Silicon Valley environments where disruption theory was gospel and clearly (among the usual suspects like colossal ego, techno-narcissism, and general corporate cluelessness) regularly contributed to very bad decisions, not to mention the vaporizing of untold sums of both real and Monopoly money.

At the time, the attack generated a lot of buzz, as it questioned the conventional wisdom of a “disruption” theory that never seems to have entertained much real questioning – like, say, the idea of climate change. It’s a complex topic and deserves far more careful and extensive analysis than I’m able or willing to provide. But what should be a productive debate can also devolve into an irrelevant academic battle, so I held back on commenting until I had reviewed as much of the debate as I thought I should. Fortunately I have no academic territory to defend viciously, and can make what I think is a helpful contribution, because I’ve realized there’s another approach we can use to replace disruption. In my cheekiness I will call it Connective Innovation. I’m not yet attempting a rigorous theory per se, and expect to get some blowback about that. And in the spirit of its own construct, it’s not really mine either: “Combinatorial Innovation” and “Combinatorial Creativity” are ideas that have been around for some time. I just thought the word “combinatorial” had too many syllables, and that there’s more value to connecting than to simply combining. Besides, I’m a marketing guy.

If you’re unfamiliar with the theory as articulated by Christensen, it’s what Lepore calls a “retrofit” of Austrian economist Joseph Schumpeter’s theory of “creative destruction,” and it basically says that products or services take root initially in simple applications at the bottom of a market and then relentlessly move up market, eventually displacing established competitors. It originally came about as a way to explain the failure of businesses, which in itself is a fascinating and very useful line of inquiry: failure analysis. To me, though, and to Lepore, it falls far short of explaining why and how businesses and innovations actually succeed, and unlike her, I can offer something that might look like an alternate explanation. I will spare you the blow-by-blow arguments about the theory and refer you to one flustered and not very compelling rebuttal by Christensen and other articulate commentaries about the debate here and here.

A persistent problem with the theory is how it’s been applied and distorted as a meme. Lepore rightly points out that one take on disruption – the “embrace failure” meme – has contributed to pervasive narcissism in a kind of get-rich-quick startup mentality, where burning through lots of other people’s money by failing in a string of startups has become a badge of honor in certain circles. Applying “disruption” as it’s frequently understood – just like blindly applying an idealistic, bogus “free market” approach – to things that really shouldn’t be disrupted or “free-marketed” in the first place, like education and government, is not always productive and often ruinous. This of course is not Christensen’s fault, nor does it mean his more than two decades of careful research have not yielded important contributions to a theoretical approach to economic and business problems, strategy, and organization, but it doesn’t help his case. Freud has been widely discredited for generations but still made important contributions.

Another little thing that’s pretty hard to ignore is that he claimed back in 2007 that the iPhone wouldn’t succeed because it wasn’t sufficiently “disruptive” to fit the theory. He has also recently claimed that Uber is not truly disruptive: I’m not buying that for a minute, sorry. Most humans think of these two amazing emergences (what do we call them, really?) as the ultimate in “disruption,” for better and for worse, because they’ve basically turned our world upside down – like so many other things, some technology, some social responses to or surprising uses of technology. Maybe Christensen just has a branding problem and should call his theory something else, like Reincarnation Theory. But there’s a serious disconnect between the theory and how most of us experience the world. Missing the boat on things like the iPhone and Uber pretty much discredits the utility of this theory for me, and for a heck of a lot of other people too, right?

To be fair, I will tease out a few things that I think Christensen does get right or that I find useful. For starters, there’s the observation that when many big companies made decisions that turned out to be very bad in retrospect, they were entirely rational, good decisions at the time, based on the only available information. Those at the helm of mighty industries are still fallible humans and likely to commit the same kinds of errors in reasoning that all of us are – it’s just a matter of scale. Daniel Kahneman and others have detailed these errors in the very useful constructs of behavioral economics: framing, anchoring, the endowment effect, What You See Is All There Is, and several others. I think Christensen’s differentiation between incremental innovations and truly transformative ones can be useful in the context of things like global climate policy and energy efficiency, where we really need big changes quickly, not business as usual until it’s too late. And I like the general idea of studying business in an academic environment and developing theories that help us to manage large complex enterprises better. But the record is mixed on these efforts because so much depends on behavioral theories that are new and often largely untested.

When reading HBR and similar journals, I do see an effort to solve HR and management issues with a keen sense of social values and a humanistic approach, but (and this may read as a harsh assessment) when it comes to academic business research, in the U.S. at least, I sometimes wonder whether it’s perpetually doomed to expectations of simply producing theories and algorithms that let funding organizations and their leaders get rich predicting the stock market. Disruption theory as a franchise could perhaps be seen this way. To be fair, we’re nowhere near being able to understand the complex behavior of organizations as well as we need to, and economics is still routinely called the “dismal science” for good reason. On the other hand, for me, behavioral economics does show signs of being a far more useful theoretical framework.

Connective Innovation theory supplies a better framework for understanding what’s happening to us today and for making better decisions. As I define this theory, it says that true transformative innovation happens when technologies (and what I’ll call for now “social movements”) combine and connect in surprising and organic ways. This isn’t anything really new at all. It’s been demonstrated brilliantly by James Burke in his TV Series “Connections” and book of the same name. Futurists Frank Diana and Gerd Leonhardt also have a lot of interesting things to say about it.

According to Ms. Lepore, for a theory to have utility and some predictive power it needs to meet these conditions: “[it] must serve both as a chronicle of the past and as a model for the future; the strength of a prediction made from a model depends on the quality of the historical evidence and on the reliability of the methods used to gather and interpret it; and historical analysis proceeds from certain conditions regarding proof.” Let’s examine connective innovation in light of these conditions.

Regarding “predictive power,” I have a problem with the word “prediction” in the first place, preferring “forecasting” or something less certain, only because what’s most useful is a set of plausible futures, not a single one we’re likely to mistake for a certainty. But we should be able to correlate the plausibility of a proposed limited set of outcomes with whatever actually comes to pass, thereby testing the validity of our process – this seems logical.

As a chronicle of the past, there is massive evidence throughout history of innovation arising as a connection between two or more separate technologies or social movements. My reading of history on this account includes James Burke’s work and many histories of technology too numerous to mention, but the pattern is very clear. As a model for the future, I believe we need to study the past for evidence because, as Robert Wright says, “there’s nowhere else to look.” What’s happening today with the explosion of IoT technology exhibits many of the same patterns (even with the very same companies, like General Electric) that electrification followed at the turn of the 20th century and the decades immediately afterwards – the similarities are particularly useful as patterns of plausible development.

As for quality of historical evidence and reliability of interpretation, this is always a problem with history, because secrets can be told only when those who kept them are dead. But we have access to much data about technology and its evolution, even through archaeology – what remains of civilizations is frequently mostly tools – so we can make connections and interpretations from those. The technology explosion that started to gain momentum even before the Industrial Revolution necessitated careful record keeping and increasingly detailed data sets, so we can compare, say, output in textile mills that switched from water to electrical power when certain technologies combined. As far as evidence goes, we have a vast amount of historical evidence on connective innovation compared to the fossil record on evolution – in fact it may be useful to see connective innovation as a big part of a theory of cultural evolution, something we’re still working out in how it connects to biological evolution.

For practical purposes, there are many things I like about the theory. It takes into account the organic and accidental nature of innovations, which aren’t entirely predictable or foreseeable. One of the things most of us understand but are likely to forget is that many transformative innovations are total accidents, involving unforeseen uses of a technology devised for something else entirely. In an example close at hand, virtual reality was invented for gaming and is now finding a huge range of other applications in global communications and design that were completely unexpected. Much of the accidental nature of innovation also occurs when technologies combine unexpectedly, as when railroads began to see that the telegraph could help to coordinate traffic, shipments, and schedules – all enabled by the adoption of standard time zones (a non-technology innovation based on social organization, but one crucial to the global economy). Factoring into the accidental nature of innovation is the fact that anyone can play, including you. Many innovations are stumbled upon by regular folks – shepherds or farmers or accountants – who happen to be in the right place at the right time and put two and two together. This phenomenon plays into the American Dream of the independent inventor genius, but in a larger sense it’s a function of the network effect. We’re seeing the value of big social networks increase as more people join them – this also increases the individual contributions each person makes to the network, which definitely include accidental innovations.

Another part of connective innovation theory I like is that not every important innovation is a technology, with wires or engines or batteries or lenses. Some of the most profound and transformative innovations are forms of social organization that are facilitated by technology, to be sure, but could (and did) exist independently of it, like the sharing economy, otherwise known as “barter.” And I think what I like most about a theory of connection is that – surprise – it’s about connections. Everything is far more connected than we realize, and digging into the history of any innovation anywhere inevitably leads to the history of, well, pretty much everything else. No one masters this narrative quite like James Burke, who is always capable of connecting the dots in very short order between, say, ant language, saltshakers, Queen Elizabeth, and electroshock therapy.

In the New Yorker article, Lepore concludes a point with “…the world’s not getting any better but our devices are getting newer.” I love this line for how it sums up so much, not least the prevalence of planned obsolescence in our industrial culture, which has a fascinating history that I’m happy to fill you in on if you’ll indulge the digression.

In his excellent history Electrifying America: Social Meanings of a New Technology, David E. Nye explains that some of the unintended and unforeseen results of industrial electrification included both rapid labor turnover, due to the insufferable boredom of performing singular repetitive tasks on production lines, and the emergence of planned obsolescence to satisfy a buying public suddenly faced with a tidal wave of undifferentiated consumer products. In order to sustain demand, products now had to be new all the time, and the basic approach of American industrial design was born – make a solid core of the systems that did the real work (motor, drive train, brakes, etc.) and retool the package every year. Thus what was a rational economic response at the time became utterly embedded in our entire industrial infrastructure, and has resulted today in movements like William McDonough’s Cradle to Cradle, which seeks to rethink and replace this outmoded paradigm. Expectation of relentless newness in products and everything else is at the heart of our value system in the United States. Planned obsolescence eventually came to have a more sinister connotation – that we’re getting cheated by companies, and wasting energy and materials (both true) – but back in the 1920s and 30s it was widely promulgated as a pillar of our economy. It was intimately bound up with our idea of the modern world – completely enabled by electrification – and the idea of human destiny as inevitable “progress,” primarily technological, towards increasing complexity.

Robert Wright, in his book Nonzero: The Logic of Human Destiny, talks about what he calls moral progress. Wright is a fascinating thinker, and has done much to bring me around to, of all things, reconciling science and religion. I started on this path with E.O. Wilson’s assertion that science explains religion better than the other way around. But after reading Wright’s Nonzero, I’ve evolved my understanding of morality considerably. Because I was never religious in the first place, like so many people, I struggled to understand what might be termed “the logic of human destiny,” which is undeniably a mouthful (or a “mind-ful,” as it were). But with help from folks like Darwin, Wilson, and Wright, I began to see that adaptive mechanisms like reciprocal altruism are common in many species and have a certain mathematical logic that confers advantage on the species that practice them. So suspending your individual needs briefly for the good of the group ends up being a better survival strategy, even for the individual. The fact that we have a big brain capable of long-term memory helps us to execute this strategy, and in fact these two adaptations are probably co-dependent. Reciprocal altruism is basically what all religions teach, and herein lies the basic architecture of morality, something we now need to focus on bigtime if we’re going to live in a world we don’t ruin first in our headlong rush to survive. We can use connective innovation as a template for how to do this, provided that one of the things we connect most with is the planet itself.

As Lepore says, “people aren’t disk drives.” A theory that attempts to explain how technology companies were born and died in the latter half of the 20th century is certainly important and valuable, but the spectrum of data upon which it relies is a bit too narrow to build a useful theory. In my world, I work with people in charge of making and managing the built environment and its components and systems. Many of us routinely make difficult decisions with long-term, scale-effect consequences that can be hard to predict or even understand. A theory like connective innovation that says “seek combinations and connections” rather than “let’s blow up another global industry today” helps us to overcome inertia, bust the professional and organizational silos that hold back innovation and progress, and see things in social, political, behavioral, experiential, economic, and yes, moral terms instead of purely technological ones. Today we have an unprecedented number of classes of new technologies as well as new (or, if you know your history, sort of recycled) forms of social organization, like crowdsourcing and the sharing economy, to play with. If you can bring yourself to see it as one huge board game (which it kind of is, in a way – it’s easily “gamified”), it can just look like we have more to work with to solve our problems, even if the tools we have are sometimes part of the problem. By factoring in social, environmental, and moral elements as drivers, we can begin to learn how to isolate the important problems to solve first, then use our increasingly rich technology to solve them. I’ve actually begun to gamify this myself, in something I call “Net-Zero Nonzero Combo,” a charrette framework for solving problems with Smart City solutions.

Perhaps the fate of “disruption” rather proves one point of “connective innovation”: a theory created to explain one specific behavior gets used to explain and justify a whole lot else, and in the end, not that well. By critiquing it, we bring a connective process to improve – we could of course say “disrupt” – it! I know that Silicon Valley technocrats like Steve Jobs, Marc Andreessen, Larry Page, and Sergey Brin are not Dr. Evil but genuinely believe that changing humanity through better-designed technology is not only possible but necessary. Of course it’s better to make voting, mail, transportation, housing, agriculture, energy, and everything else more efficient and ultimately more available for everyone. But the misapplication of a somewhat obscure and academic theory has contributed to our becoming way too focused on technology alone, with its ensuing disruption and chaos, and not on seeking equilibrium and stability, the real goals of sustainability and, ultimately, our survival. Biological evolution drives cultural evolution, not the other way around (not yet).

 

Yellow to Blue: the Recent History of Lighting and Color

As lighting is getting more energy efficient and the cost of new lighting technology is dropping, light is getting bluer. This is generally not a good thing, for many reasons. But the most important (IMHO) is that we may be deeply hardwired for warmer light at night, and the juggernaut of blue light technology seems to be happening in total ignorance of and opposition to our basic physiology.

Customer Experience Sucks Globally- Change That and Win

Ten years ago I contributed to a book on customer service, Customer Service Delivery: Research and Best Practices, in which I decried the increasing tidal wave of spam and vanishing privacy that was hitting us then as a result of emerging “marketing automation.” Today I’m unhappy to report that the situation is no better; in fact, it’s much worse. While IT and automation make transactions easier and “frictionless,” companies in general have used them to cut costs, de-personalize experience, and relentlessly overload us with a massive, invasive, irritating, at times simply immoral tsunami of global spam that shows few signs of slowing down.

The Net-Zero Nonzero City

We hear a lot lately about Net Zero buildings that produce as much energy as they consume, and as we extend this idea to cities, energy efficiency will continue to be a crucial part of our future. Nonzero refers to the human part of the equation: social arrangements, especially cities, where a nonzero-sum (or win-win) game is possible, and where cooperation, exchange, innovation, trade, environmental balance, and sustainable growth are facilitated and accelerated.

Sustainability Beyond Energy

On balance, the green building movement and energy efficiency programs in the U.S. have been successful over the last decade or so. But sustainable design practitioners often forget important non-energy benefits, and these benefits are crucial to improving efficiency and attacking climate change. We now have the technical tools to make the process of integrated design much more deliberate; the hard part is mustering the social and political will to do so.

Lighting and Global Warming Part 2 – A Mindset for Future Practice

In this post I make suggestions on how to deal with this issue on a personal level if you work in lighting or the building industry. There are many industries you can work in, jobs you can do, and spiritual paths you can take to apply your skills and energy to this global issue. For me, the building industry, and lighting in particular, presents great opportunities to make an impact because buildings use so much energy in their construction and maintenance. And with buildings you must act locally, get your hands dirty, and make tangible things that are visible, complex, problematic, social, transformative, and an integral part of our natural habitat: the built environment.

Lighting and Global Warming- Part 1 – Facing Uncertainty, Utopia, and Apocalypse

In light of the upcoming Paris negotiations on climate change, it's obvious that our efforts may fall short and that we'll need to redefine the problem without giving up, losing hope, or abandoning current successful efforts. How can lighting and building design professionals learn to make a difference in climate change?

How and When Will Smart Actually Feel Good?

In the past few months, despite my better judgment, I've become a bit more optimistic about overcoming tech fatigue and somehow lumbering ahead to a fabulous future. In the process of producing and speaking at several events focused on Smart City and Smart Lighting, I've expanded my perspective through a number of different lines of inquiry.
