Richard Evans: Innovation in the arts


Richard Evans, president of EmcArts, gave a perceptive October 2012 talk on innovation in the arts that translates, in many respects, to just about any sector of the economy.

I want to suggest that we are entering — and we’re pretty far into — a new era for the arts. …  The old era was structured for growth. … [and] the value proposition that was sustained during that time was about excellence and scarcity.

There was a quality of programming that many had not had access to before. But, at the same time, there was the sense that we were taking artists from our communities and bringing them together in ensembles of unprecedented quality, and then selling them back to our communities at the highest ticket prices that those communities could afford. There’s an element of scarcity that is almost inevitably linked to the excellence that I think was underlying this first phase of the arts.

It’s a phase that I think is now decisively over. And I think we’re now having to be structured for resilience, not for growth. … It means that organizations have to develop much higher levels of adaptive capacity — the capacity to change effectively and respond to circumstances.

One of the stories he tells is about the Memphis Symphony Orchestra:

If you know the mission statements of most American orchestras, they tend to be something like: We’re going to perform the best music in the world, to the most people we can, at the highest possible quality, as often as possible.

Well, the Memphis Symphony Orchestra went in a different direction. They now say that their job is to create meaningful experiences for the citizens of Memphis, through music.

It’s that little shift that makes music the means and not the end that really opened doors in the community for the orchestra.

When they looked at how they might implement that mission, they found all sorts of partners in the community. Because instead of going out and saying, “Could we play our music in your space?” they were saying, “How can we work with you to address the issues that you have?”

And one of his touchstones is Ron Heifetz on technical and adaptive challenges:

[Image: Ron Heifetz’s distinction between technical and adaptive challenges. H/t Institute for the Study of Coherence and Emergence.]

Conference design: Modes of Explanation

The thoughtful preparation of Modes of Explanation (May 21-25 in Paris) sets a high bar for conference design:

  • All participants will be required to submit either a video of their contribution or a full version of their paper at least three weeks in advance of the event.
  • The videos will be published online and transcripts prepared.
  • All papers will similarly be published and OCR’d.
  • Word clouds will be prepared of each presentation and of each paper, designed to show at a glance what the contribution is about.
  • Contributions will be gathered into two-hour discussion sessions, where the panel of discussants is expected to have read and/or watched each other’s contributions.
  • Presentations in the sessions will be limited to 5 minutes each so as to maximize the time for interactive discussion.
  • All sessions will have their own Wiki site and will be recorded for transcription and future access.
  • Attendance at Modes of Explanation will be limited to 80 and discussions are expected to involve not more than two dozen participants each.

Bulleted format is mine; h/t Ray Ison
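
The word-cloud step in the list above is straightforward to automate. Here is a minimal sketch, assuming the third-party Python `wordcloud` package and a plain-text file holding one OCR’d paper or transcript (the file name and tooling are my assumptions, not details from the organizers):

```python
# Minimal word-cloud sketch using the third-party `wordcloud` package.
# The input file name is hypothetical; any plain-text contribution works.
from wordcloud import WordCloud

with open("contribution.txt") as f:
    text = f.read()

cloud = WordCloud(width=800, height=400, background_color="white",
                  max_words=100).generate(text)
cloud.to_file("contribution_cloud.png")  # word size tracks term frequency
```

Run once per paper and per transcript, and the at-a-glance summaries fall out of the same pipeline that already produces the transcripts.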

Design, broadly understood, is about purposeful action, the process of creating change in the world around us.

If design is to be thus understood — as a fundamental domain of inquiry, alongside the sciences and humanities — then our conception of the designer must evolve as well.

Béla Bánáthy wrote on this topic in Designing Social Systems in a Changing World (1996):

A discussion on model making should include studying the model maker. The model maker is the designer who constructs a representation, a model of the systems to be created. Based on a contemplation and interpretation of the work of Lippitt (1973), a few of the salient attributes of the model maker are introduced.

Confidence and a certain amount of courage are required to transcend an existing system or state and attempt to create a conceptual model as the design solution. Making a model of a desired future system requires a degree of confidence in one’s assessment of the present and in one’s commitment to a vision of the future, and it calls for willingness to take risk with conviction.

Situational sensitivity implies seeing things that others might not perceive, seeing things that are not obvious, stretching one’s perceptual powers to capture and feel more about the situation than would ordinarily be the case, and becoming tuned in to the complexity of emerging design solutions.

Flexibility calls for adjusting quickly to emerging developments in the design situation, extending the boundaries of the design inquiry, experimenting with various design solutions, abandoning old assumptions and trying out new ones, and — most importantly — adding new dimensions to the solution rather than merely adjusting the old.

Tolerance for ambiguity and uncertainty means tolerating a certain amount of disorder in bringing meaning to contradictions and dynamic complexity, living with the uncertainty of emerging design solutions, and withstanding the pressure for immediate or quick solutions.

Moving back and forth between analysis and synthesis interactively is an essential requirement of the model builder. In synthesizing, we identify, create, combine, and enfold different elements into a holistic framework of the emerging process and structure of the system. At the same time, we constantly analyze what we create. Moving between synthesis and analysis is a key requirement of effective model building.

Managing design takes place in an environment of dynamic complexity that is unpredictable, ambiguous, and unique. Designers in such contexts cannot rely on standard procedures. They have to manage design with the use of methods tailored to the design situation while they seek solutions in the flow of all the processes that are manifested in social systems.

(Bold emphases are mine; see also: Science, humanities, design: The three cultures.)

The whole-of-community action web

The nebulous notion of sustainability begs for visual representation.

I collect favorites, and a few years ago I worked with designer Andrew Fuller to create a series of posters, “representations of sustainability,” for a show at Portland’s Sea Change gallery.

Yesterday’s post on Valerie Brown and the book Tackling Wicked Problems reminded me of this poster featuring her diagram: the whole-of-community action web.

From Brown’s book Leonardo’s Vision: A Guide to Collective Thinking and Action:

The potential for whole-of-community learning rests on a paradox. Any lasting change in any one community depends on matching transformation in the surrounding society. Any lasting change in the larger society depends upon transformational change in its constituent communities. …

The whole-of-community action web suggests six forms of connection are needed to hold communities together as they learn and change.

In the diagram, these six are: whole-of-community commitment, internal community integrity, inter-community partnerships, external community alliances, integrated forward planning, and future-directed action.

Valerie Brown on wicked problems

[I posted this week on the design approach to wicked problems advocated by Nigel Cross and the “clumsy” approach described by Steve Rayner. For comparison, here is a reprint of my April 2011 P&P post on transdisciplinarity and wickedness, the method outlined by Valerie Brown in Tackling Wicked Problems: Through the Transdisciplinary Imagination. I reviewed the book in the September 2011 issue of Ecopsychology.]

Science-as-usual will not solve complex problems. This is the starting point for Tackling Wicked Problems: Through the Transdisciplinary Imagination, edited by Valerie Brown, John Harris and Jacqueline Russell.

The idea that problems might be “wicked” was developed in a 1973 paper (“Dilemmas in a General Theory of Planning”) by Horst Rittel and Melvin Webber:

The search for scientific bases for confronting problems of social policy is bound to fail, because of the nature of these problems. They are “wicked” problems, whereas science has developed to deal with “tame” problems.

“Tame social problems have for decades been the province of systems analysis,” writes historian Howard Segal, reviewing Tackling Wicked Problems (“Solutions beyond systems analysis,” Nature, sub. req.):

Such thinking holds that if a team of experts fails to solve a social problem, one can simply add more experts until a solution is found. … The book shows that systems analysis is still applied, but its utopian aspirations have faded.

Tyndall Centre for Climate Change Research founding director Mike Hulme, also writing on Tackling Wicked Problems, uses his review to highlight the narrow emphasis on the STEM disciplines (pdf):

The political orthodoxy in liberal Western democracies is that investment in the STEM disciplines – science, technology, engineering and maths – provides the most assured basis for future economic vibrancy and social well-being. … There are few books which mount such an audacious challenge – one that is both theoretically and practically informed – to the presumptions of the positivism underpinning the STEM ideology.

I’ve been a fan of Brown’s work since her book Leonardo’s Vision: A Guide to Collective Thinking and Action and the edited volume Social Learning in Environmental Management: Building a Sustainable Future.

She introduces the new book in a podcast recorded by the publisher, Earthscan, and rebroadcast by Radio Ecoshock (along with an interview with political scientist Thomas Homer-Dixon, and talks by philosopher of science Jerome Ravetz and open-source legal advocate Eben Moglen).

From Brown’s talk (transcribed without ellipses):

Wicked problems are embedded in society, and since you can’t cure a society, you clearly have to take some other approach to it.

Wicked problems have many causes. They involve multiple interests. They evade any simple definition, because all those interests would have a separate definition. So we need a new way of thinking about science before we can resolve a wicked problem.

I’m going to propose that we need a new science, and I call it transition science. Resolving a wicked problem calls for transition science based on collective decisions. It’s multi-causal, involving multiple interests.

If you have multiple interests, each has its own knowledge construction. Individual knowledge is based on personal, lived experience. Local knowledge is based on shared community events. Experts contribute from a particular box that they are trained in. Strategic knowledge is the organizational agenda. Holistic knowledge gives focus and vision.

The knowledges reject each other. One of the troubles in working to bring them together is that there is a grain of truth in each. Individual knowledge can be biased. Local knowledge can be merely anecdote. Specialized knowledge can speak in jargon. Strategic knowledge is by definition often self-serving. And holistic knowledge is often dismissed as airy-fairy.

Looking at transition science, I’m going to suggest that it’s a nested set – the relationships between these knowledges are many and varied. But for constructive decision-making, you’re going to need them all.

Clumsy solutions to wicked problems

[I wrote last time about Nigel Cross’s sense that design thinking can help tackle wicked problems. That post reminded me of this June 2010 one I wrote for P&P, “Clumsy Responses to Wicked Climate Problems,” reprinted in full below.]

[Slide: Steve Rayner, “Clumsy Climate Strategy.”]

More and more, climate change is referred to as a wicked problem, one that is “difficult or impossible to solve because of incomplete, contradictory, and changing requirements” (Wikipedia).

From Horst Rittel and Melvin Webber’s “Dilemmas in a General Theory of Planning,” a 1973 paper that describes wicked problems:

The Enlightenment may be coming to full maturity in the late 20th century, or it may be on its deathbed. … By now we are all beginning to realize that one of the most intractable problems is that of defining problems (of knowing what distinguishes an observed condition from a desired condition) and of locating problems (finding where in the complex causal networks the trouble really lies). …

As distinguished from problems in the natural sciences, which are definable and separable and may have solutions that are findable, the problems of governmental planning — and especially those of social or policy planning — are ill-defined; and they rely upon elusive political judgment for resolution. (Not “solution.” Social problems are never solved. At best they are only re-solved — over and over again.)

Clumsiness is conceived of as a response to wickedness. From Marco Verweij and coauthors in “The Case for Clumsiness,” the introductory chapter to the book, Clumsy Solutions for a Complex World: Governance, Politics and Plural Perceptions:

The term ‘clumsy institution’ was coined by Michael Schapiro (1988) as a way of escaping from the idea that, when we are faced with contradictory definitions of problem and solution, we must choose one and reject the rest.

I reprint the slide above (with permission) from Steve Rayner’s 2006 presentation “Wicked Problems: Clumsy Solutions.” Rayner is director of the University of Oxford’s Institute for Science, Innovation and Society, and a coauthor of “The Case for Clumsiness” (as well as of The Hartwell Paper and the Oxford Principles on Geoengineering).

From Rayner’s notes (Jack Beale Memorial Lecture on Global Environment):

Now, there are challenges of course for these approaches. The media and voters expect policymakers to fix problems. It is a hard sell for a policymaker to say well, you know, I’m going to wait for a clumsy solution to emerge, I’m going to make sure all the voices get heard. Policymakers, generally speaking, demand scientific bottom lines for decision-making. Although we know from the knowledge utilisation literature that they rarely use them to make policy, they still demand them. And scientists are committed to improving knowledge, so often hold out unrealistic expectations to policymakers that they will produce knowledge that will cut the Gordian knot of a wicked problem.

And we also have the hammer problem, which is the success of rational choice theory in solving more straightforward problems which exacerbates expectations for their appropriateness for application to wicked problems. We often hear that there are claims that there are no alternatives to the application of benefit cost analysis and other kinds of rational choice tools.

So there are hard challenges ahead for a clumsy approach. But, a very good friend of mine, Shiv Visvanathan, pointed out recently that democracy is not merely a design problem, it’s a challenge to the imagination. So I’m offering clumsy solutions to wicked problems as a challenge to the imagination. I believe that embracing clumsiness moves us from techniques for selecting among well defined alternatives towards looking for new skills for creating imaginative solutions.

(See a larger version of the Clumsy Climate Strategy slide.)

Science, humanities, design: The three cultures

[Table: the three cultures of science, humanities, and design.]

[Update: 20AUG2013 posted revised table at top. See update here.]

Are there designerly ways of knowing? Does the design mode of inquiry depend on distinct methods and values? How might design ability be developed?

Nigel Cross, emeritus professor of design studies at The Open University, has been exploring these questions since the 1970s, in a long list of publications.

I adapted this table from the writings of Cross and from a similar table by Béla Bánáthy in Designing Social Systems in a Changing World.

From Cross’s 1982 “Designerly ways of knowing” (pdf):

From the RCA [Royal College of Arts] report, the following conclusions can be drawn on the nature of ‘Design with a capital D’:

  • The central concern of Design is ‘the conception and realisation of new things’.
  • It encompasses the appreciation of ‘material culture’ and the application of ‘the arts of planning, inventing, making and doing’.
  • At its core is the ‘language’ of ‘modelling’; it is possible to develop students’ aptitudes in this ‘language’, equivalent to aptitudes in the ‘language’ of the sciences – numeracy – and the ‘language’ of humanities – literacy.
  • Design has its own distinct ‘things to know, ways of knowing them, and ways of finding out about them’.

I identified five aspects of designerly ways of knowing:

  • Designers tackle ‘ill-defined’ problems
  • Their mode of problem-solving is ‘solution-focussed’
  • Their mode of thinking is ‘constructive’
  • They use ‘codes’ that translate abstract requirements into concrete objects.
  • They use these codes to both ‘read’ and ‘write’ in ‘object language’.

From these ways of knowing I drew three main areas of justification for design in general education:

  • Design develops innate abilities in solving real-world, ill-defined problems.
  • Design sustains cognitive development in the concrete/iconic modes of cognition.
  • Design offers opportunities for development of a wide range of abilities in nonverbal thought and communication.

From a 2009 interview in Rotman Magazine:

How can design thinking help managers tackle ‘wicked’ problems?

Part of the difficulty in dealing with wicked problems is you don’t know when you’ve got the right solution. There’s no definitive, correct solution, and there’s no definitive, correct view of the problem. They’re called wicked problems because they haven’t been tamed; they haven’t been structured or well-defined; they’re not cut and dried; and they don’t yield readily to a single optimal solution. This is an important point because much of training in management and reasoning is about finding the ‘right’ solution. In dealing with wicked problems, it’s not about optimizing, it’s about ‘satisficing’, as Herbert Simon describes it in his book The Sciences of the Artificial. Simon, himself greatly accomplished in science and economics, realized that when faced with these sorts of intractable problems, you can’t actually optimize a solution, but rather find one that is satisfactory.

Another thing we’ve realized about these wicked problems is that the problem and the solution have to be – and indeed do – develop together. The understanding of the problem begins to develop as soon as you try to develop ideas as to how you’re going to solve it. It’s a mistake to set out what the problem is first and then try to find a solution. Be prepared for the fact that the problem and the solution will co-evolve, and you’re going to go to and fro between the two of them.

The final point about wicked problems is that constructive or design thinking is indeed the best way to tackle them.
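
Simon’s optimizing/satisficing distinction is easy to make concrete. A minimal sketch (my illustration, not Cross’s or Simon’s code), contrasting an optimizer that must score the entire option space with a satisficer that stops at the first option clearing an aspiration level:

```python
def optimize(options, score):
    # Optimizing: requires a well-defined, fully surveyable option
    # space so the single best candidate can be identified.
    return max(options, key=score)

def satisfice(options, score, good_enough):
    # Satisficing (Simon): accept the first option that clears the
    # aspiration level, without surveying the whole space.
    for option in options:
        if score(option) >= good_enough:
            return option
    return None  # nothing acceptable; revise the aspiration level

designs = ["A", "B", "C", "D"]
score = {"A": 0.4, "B": 0.7, "C": 0.9, "D": 0.6}.get

print(optimize(designs, score))         # "C": the global best
print(satisfice(designs, score, 0.65))  # "B": first good-enough option
```

The satisficer never needs a complete, well-structured option space, which is precisely what a wicked problem refuses to provide.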

Thoughts on design as a “third culture” or on this table?

[Update: see also — Bela Banathy: Salient attributes of the designer.]

Robert Rapier: Oil resources are not reserves

I keep catching misleading statements about U.S. oil, so this Robert Rapier post, one of the “top ten of best read Oil Drum posts in 2012,” seems worth a CC-enabled reprint.

The Difference Between Oil Shale and Oil-Bearing Shale

People are often confused about the overall extent of U.S. oil reserves. Some claim that the U.S. has hundreds of billions or even trillions of barrels of oil waiting to be produced if bureaucrats will simply stop blocking development. In fact, in a recent debate between Republican candidates contending for Gabrielle Giffords’ recently vacated House seat, one candidate declared “We have more oil in this country than in Saudi Arabia.” So, I thought it might be a good idea to elaborate a bit on U.S. oil resources.

Oil production has been increasing in the U.S. for the past few years, primarily driven by expanding production from the Bakken Shale Formation in North Dakota and the Eagle Ford Shale in Texas. The oil that is being produced from these shale formations is sometimes improperly referred to as shale oil. But when some people speak of hundreds of billions or trillions of barrels of U.S. oil, they are most likely talking about the oil shale in the Green River Formation in Colorado, Utah, and Wyoming. Since the shale in North Dakota and Texas is producing oil, some have assumed that the Green River Formation and its roughly 2 trillion barrels of oil resources will be developed next because they think it is a similar type of resource. But it is not.

Although the oil in the Bakken and Eagle Ford is being extracted from shale formations, the term shale oil has been used for over 100 years to describe a very different resource. This has led to some confusion over the differences between current production in North Dakota and potential production in Colorado. The oil in the Bakken and Eagle Ford formations actually exists as oil, but the shale does not allow the oil to flow very well. This oil is properly called “tight oil”, and advances in hydraulic fracturing (fracking) technology have allowed some of this oil to be economically produced. (For more details, I discuss resources, reserves, fracking, shale gas, and oil shale in my new book Power Plays: Energy Options in the Age of Peak Oil.)

The estimated amount of oil in place (the resource) varies widely, with some suggesting that there could be 400 billion barrels of oil in the Bakken. Because of advances in fracking technology, some of the resource has now been classified as reserves (the amount that can be technically and economically produced). However, the reserve is a very low fraction of the resource at 2 to 4 billion barrels (although some industry estimates put the recoverable amount as high as 20 billion barrels or so). For reference, the U.S. consumes a billion barrels of oil in about 52 days, and the world consumes a billion barrels in about 11 days.
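
As a quick back-of-the-envelope check on those figures (a sketch; the daily consumption rates are my rough assumptions for the period, not numbers from Rapier’s post):

```python
# Rough consumption rates circa 2012 (my assumptions):
US_BARRELS_PER_DAY = 19e6      # U.S. oil consumption
WORLD_BARRELS_PER_DAY = 89e6   # world oil consumption
BILLION_BARRELS = 1e9

print(BILLION_BARRELS / US_BARRELS_PER_DAY)     # ~52.6 days: "about 52"
print(BILLION_BARRELS / WORLD_BARRELS_PER_DAY)  # ~11.2 days: "about 11"
```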

Like the Bakken, the Eagle Ford formation in Texas consists of oil (and natural gas) in tight formations that is being accessed via fracking. The amount of technically recoverable oil in the Eagle Ford is estimated by the U.S. Department of Energy to be 3.35 billion barrels of oil.

Without a doubt, these two formations are a major factor in the current resurgence of U.S. oil production. But the Green River formation is the source of talk of those enormous oil resources — larger than those of Saudi Arabia — and it is a very different prospect than the tight oil being produced in North Dakota and Texas. The oil shale in the Green River looks like rock. Unlike the hydrocarbons in the tight oil formations, the oil shale (kerogen) consists of very heavy hydrocarbons that are solid. In that way, oil shale more resembles coal than oil. Oil shale is essentially oil that Mother Nature did not finish cooking, and thus to convert it into oil, heat has to be added. The energy requirements — plus the fact that oil shale production requires a lot of water in a very dry environment — have kept oil shale commercialization out of reach for over 100 years.

Thus, while the U.S. might indeed have greater oil resources than Saudi Arabia, U.S. oil reserves (per the BP Statistical Review of World Energy) are only about 1/10th those of Saudi Arabia. The distinction is important.

Summarizing the Definitions

To summarize, let’s review the definitions for the important terms discussed here:

Oil resource — the total amount of oil in place, most of which typically can’t be recovered

Oil reserve — the amount of oil that can be recovered economically with existing technology

Oil shale — sedimentary rock that contains solid hydrocarbons called kerogen (e.g., Green River Formation)

Shale oil — the oil that can be obtained by cooking kerogen

Tight oil — liquid hydrocarbons that are obtained by hydraulic fracturing of shale formations (e.g., Bakken Formation and Eagle Ford Formation)

Conclusion: Resources are not Reserves, and Tight Oil isn’t Shale Oil

It is pretty clear that at current oil prices, developments in the tight oil formations will continue. It is not at all clear that even at $100 oil the shale in the Green River formation will be commercialized to produce oil, although a number of companies are working on it and will continue to do so. Oil shale is commercially produced in some countries like Estonia, but it is primarily just burned for power.

In order to commercially convert the oil shale into oil, a more energy efficient method of producing it must be found (or, one would have to have extremely cheap energy and abundant water supplies to drive the process). I have heard from multiple industry sources that the energy return for producing oil from oil shale is around 4 to 1 (lower than for oil sands production), and that is before refining the oil to finished products. At this sort of energy return, oil sands will continue to be a more economical heavy oil option.
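
To see what a 4-to-1 energy return implies, here is a minimal sketch of the net-energy arithmetic (the oil-sands figure is an illustrative assumption of mine for comparison, not Rapier’s number):

```python
def net_energy_fraction(eroi):
    # Fraction of gross energy output remaining after paying back the
    # energy invested in production, at a given energy return (EROI).
    return 1 - 1 / eroi

print(net_energy_fraction(4.0))  # oil shale at 4:1 -> 0.75 of output is net
print(net_energy_fraction(5.0))  # oil sands at an assumed 5:1 -> 0.80 net
```

And, as Rapier notes, the 4:1 figure applies before refining, so the return on delivered fuel is lower still.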

Thus, my prediction is that despite having an oil shale resource that may indeed be far greater than the oil resources of Saudi Arabia (I don’t think I have seen an estimate of Saudi’s total oil resources), the reserve will continue to be close to zero for the foreseeable future because there are still many technical hurdles to overcome to realize a scalable, commercially viable process.

Finally, I would say that if a commercially viable process for shale oil production from the Green River formation is developed, the environmental blowback will be enormous. The production of shale oil is more energy intensive (i.e., has higher carbon emissions) than for the oil sands, it has a high water requirement in a dry climate, and it is potentially a huge new source of carbon dioxide emissions.  The environmental protests that would arise in response to a growing commercial shale oil operation would make the Keystone XL pipeline protests pale in comparison.

Sherry Turkle: AAAS talk on sociable robotics

The theme of this year’s American Association for the Advancement of Science (AAAS) meeting is the “unreasonable effectiveness” of science.

Friday’s plenary talk was by MIT social scientist and psychologist Sherry Turkle on “sociable robotics.”

The idea of some kind of artificial companionship has become the new normal. … But I think that this new normal comes with a price. Because for the idea of artificial companionship, for the idea of the teacher robot, for example, to become our new normal, we have to change ourselves.

And in the process, we are remaking human values and human connections. We change ourselves, even before we make the robot. We think we are making robots, but we are remaking people. …

What are we talking about when we’re talking about robots? We’re talking about our fears of each other. Our disappointments with each other. Our lack of community. Our lack of time.

In these conversations, I hear exhaustion. Because getting these things back, seems beyond us. I hear hopelessness about investing in people to make them fit to take care of each other. To take care of us, eventually.

People go straight from their reservations about a health care worker who didn’t finish high school, to a dream of inventing a robot to care for them, just in time.

Again, we live at the robotic moment, not because the robots are ready for us, but because we are counting on them.

When we assume artificial companionship, it changes how our children grow up, it changes how we treat each other, it changes how we think about caring for each other, across the generations.

Stafford Beer: 1973 Massey Lectures

At the time of this 1973 talk, cyberneticist Stafford Beer had just returned from Chile, where his Cybersyn project with the Allende government had ended with the military coup d’état.

“The target is to transform the whole of industrial management, and to make Chilean industry fully effective in one year,” wrote Beer. Instead, his staff destroyed the computer tapes. (See “Designing Freedom, Regulating a Nation” [pdf], by Eden Miller, author of the award-winning Cybernetic Revolutionaries.)

The CBC has recently opened its full audio trove of annual University of Toronto Massey Lectures, including these six talks by Beer, in whose precisely regulated speech patterns I sometimes hear bullet points.

From lecture 1:

Remember these aspects of our work together so far:

  • A dynamic system is in constant flux, and the higher its variety, the greater the flux.
  • Its stability depends on its net state reaching equilibrium, following a perturbation.
  • The time this process takes is the relaxation time.
  • The mode of organization adopted for the system is its variety controller.

With these points clearly in our minds, it is possible to state the contention of this first lecture with force and, I hope, with simplicity.

Here it goes:

  • Our institutions were set up a long time ago.
  • They handled a certain amount of variety, and controlled it by sets of organizational variety reducers.
  • They coped with a certain range of perturbations, coming along at a certain average frequency.
  • The system had a characteristic relaxation time, which was acceptable to society.
  • As time went by, variety rose, because the relevant population grew and more states became accessible, both to their population and to the institutional system.
  • This meant more variety reducers were systematically built into the system, until today our institutions are nearly solid with organizational restrictions.
  • Meanwhile, both the range and the frequency of the perturbations have increased.
  • But we said just now, the systemic variety has been cut — this produces a mismatch.
  • The relaxation time of the system is not geared to the current rate of perturbation. …
  • Hence our institutions are in an unstable condition.
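
Beer’s mismatch can be sketched numerically. In the toy model below (my illustration, not Beer’s), unit perturbations strike a system whose deviation from equilibrium decays geometrically over its relaxation time; when perturbations arrive faster than the system can relax, deviation accumulates instead of settling:

```python
def peak_deviation(relaxation_time, perturbation_interval, steps=10_000):
    # Toy model: each perturbation adds a unit deviation; the system
    # decays back toward equilibrium at a rate set by relaxation_time.
    decay = 1.0 - 1.0 / relaxation_time
    deviation, peak = 0.0, 0.0
    for t in range(steps):
        deviation *= decay
        if t % perturbation_interval == 0:
            deviation += 1.0  # a perturbation arrives
        peak = max(peak, deviation)
    return peak

print(peak_deviation(relaxation_time=10, perturbation_interval=50))  # ~1.0
print(peak_deviation(relaxation_time=50, perturbation_interval=10))  # ~5.5
```

When the relaxation time is short relative to the interval between shocks, the system settles after each one; reverse the ratio and the deviations pile up, which is Beer’s unstable institution in miniature.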

From lecture 4:

The brain is a finite instrument that mediates all our experience and is therefore limiting.

As a personal aside, let me say that I am more interested in the fact that I could not recognize an angel if I met one — because my brain does not have requisite variety — than I am in the illegitimate scientific argument that angels don’t exist because I have not recognized one yet.

Returning to the main argument about the limitations of the brain, I have argued that we as individuals are the unwitting victims of a cultural process which very drastically delimits variety for us.

In the first place, our economic environment points to an increasing use of science and technology in what is allegedly the service of man, but which I contend takes this ‘service’ in a false sense.

As a result, we stand — and the innocent legatees of our policies in the developing nations yet more vulnerably stand — to be exploited by whoever wields the power of science to technocratic ends.

In the second place, the instruments of variety constraint turn out to be education and the communications media, both of which we culturally suppose to be variety amplifiers. This belief is as delusory as the belief that we can fully know reality.

It’s entirely possible to take corrective action about all this — not the biological limitations, but the societal constraints. To do so requires that the people themselves take control of the use of science through their democratic processes.

This means furnishing them and their governments with new channels of communication, and a new kind of education system, and a new kind of publishing system.

Why are these recommendations necessary? The answer is that the necessary attenuation of variety produces in us a mere model of the world. And in so far as we wish to control the world, whether as citizens or as individuals within a personal environment, our powers of regulation are cybernetically constrained by the model we hold of what needs to be regulated.

Our civilization has led us to a manifestly dysfunctional model. Then, we must equip ourselves to revise it. The power to do this, we certainly do possess.

See also: “Climate regulation and requisite variety.”