The Concept of Art

Is there a universal way to define the word “art”?

If by “universal” you mean a definition of art that captures everything that is a work of art and excludes everything that is not (as opposed to a definition of art that is universally accepted), then the answer is yes.

At least, there are ways of trying to define art universally, although none has been conspicuously successful or satisfying to all. However, that doesn’t mean we can’t learn something by making the attempt and by reflecting on attempts made. There was quite a lot of that in the last half of the twentieth century, though in the last couple of decades the focus has shifted to artistic value – why art matters – and issues involving aesthetic properties. Beauty, especially, has gotten a good deal of (in my view) needed attention.

There are ways of forming a conception of something other than by defining it, especially if you think of a definition as a statement of individually necessary and jointly sufficient conditions. In the last century, however, George Dickie seemed to have found such a definition. He said, roughly speaking, that something counted as an artwork just in case it was offered as a candidate for appreciation by someone institutionally authorized to do so. The institution in question was usually thought of as the “art world,” a rather loose but not utterly incoherent arrangement of practices, beliefs, values, authorities, and other things we associate with an ongoing and purposive social activity.

Why did Nietzsche call Kant a “theologian in disguise”?

Nietzsche meant that Kant established the validity of Christian morality by making philosophical arguments that didn’t rely on Christian beliefs.

In The Gay Science, Nietzsche writes:

Kant wanted to prove, in a way that would dumbfound the common man, that the common man was right: that was the secret of this soul. He wrote against the scholars in support of popular prejudice, but for the scholars and not for the people. [§193.]

Kant held that all rational persons have an a priori understanding of the basic principles of morality. These consist of duties, both to oneself and to others, and above all the duty to respect rational agents. Most persons, however, do not understand that morality is a priori, and their moral commitments are therefore vulnerable to corrosive skeptical criticism. In the Groundwork of the Metaphysics of Morals Kant formulates the ultimate standard for moral judgment, namely universalizability, and establishes the rational necessity of morality.

The History of Literary Criticism

I recently read Joseph North’s Literary Criticism: A Concise Political History (Harvard, 2017). North is very good at characterizing what the field has settled into since the beginning of the 2000s, once the glory days of postmodernism were over. The main result of the postmodern turn was to play down concern with the aesthetic dimension of literature in favor of teaching students to think about literature from a “historicist” and “contextualist” – code for “political,” and often crudely ideological – perspective.

That much is clear, but the question is, how did it happen?

According to North, the Baby Boom generation that accomplished the turn from aesthetics to politics relied on a misrepresentation of the so-called “practical criticism” of I.A. Richards, which they encountered in the form of the American school of New Criticism. This, the younger generation believed, was an essentially conservative enterprise that encouraged political passivity by isolating literary value from the wider world. The rebellion against it culminated in the New Historicism, which paved the way for post-colonialism, queer theory, disability studies, and the rest.

A Modest Proposal

UC Berkeley’s law school recently changed its (unofficial) name from “Boalt Hall” to “Berkeley Law” because John Boalt, after whom the building that houses the law school was named, had “said racist things,” in the words of Dean Erwin Chemerinsky.

But how does referring to the school with the name of someone who not only said racist things, but also owned slaves, solve the problem?

The city of Berkeley is named for the notorious slave-owner Bishop George Berkeley (1685–1753), an Irish philosopher who took the opportunity provided by some years spent in Rhode Island to buy some of his fellow human beings and force them to labor on his plantation.

The Case Against Democracy

In his book Against Democracy (2017), Jason Brennan presents a formidable amount of empirical evidence to the effect that the more someone is involved in politics, the worse he or she becomes as a person.

  • The more you are involved in political debate (especially as the representative of a group or ideology), the less likely you are to reach reasonable conclusions. Participation increases people’s tendency to ignore facts that don’t support their position, to argue in manipulative and deceptive ways, and to adopt extreme views; in general, it makes people more biased and less reasonable.
  • The more active you are in politics, the less likely you are to talk with people whose views run contrary to your own. In fact, you may reach the point where you are unable even to imagine a point of view other than your own. As a result, the more active you are in politics, the less good you will be at doing what politicians are supposed to do: see enough sides of an issue to craft and sell a compromise.

Progressivism and Disciplinary Power

Over the last decade or so, “progressive” activists have exhibited a desire to regulate the personal behavior and values of their fellow citizens. Language, attitudes, expressions, gestures, feelings, and even thoughts are to be policed, with the aim of enforcing principles of conduct established by self-appointed “experts” in the workings of racism, sexism, classism, ableism, and other injustices.

Foucault’s concept of disciplinary power might conceivably help us think about the rise of illiberalism on the progressive left. There are at least as many differences as there are similarities, however, between disciplinary power and the regulation of personal behavior pursued by activists today.

What is disciplinary power? Foucault’s view was that after the Enlightenment had undermined the moral authority of religion, modern societies developed professional and academic disciplines that purported to use scientific methods to acquire empirical knowledge of human behavior. These sciences – psychology, sociology, economics, anthropology, criminology, medicine – established how human beings normally behaved under various circumstances.

Theoretically, “normal” meant “average.” But in practice, “normal” was implicitly taken to mean “good” or “ideal.” This, Foucault argued, made possible a form of oppression that was characteristic of liberal democratic societies: individuals “internalized” the norms established by the disciplines and regulated themselves accordingly. In this way, social scientific “experts” in human behavior played the role of the earlier religious and moral authorities.

What do Horkheimer and Adorno mean when they say: “myth is already enlightenment”?

Myth is “already” enlightenment because myth and enlightenment have something in common: the desire to control nature, rooted in a fear of nature and an aggression toward it.

In myth and religion, human beings tried to control nature (at least to the extent of sustaining the right amounts of sun, rain, and fertility) by propitiating the gods, offering them sacrifices and other signs of devotion. Myth and religion understood nature in personal terms, seeing in forces such as sun, rain, and wind the recognizably human qualities of purpose and desire. Knowledge of reality was acquired by means of inspired mystical experiences, and passed down through the generations as authorized by tradition.

The central principle of the Enlightenment, on the other hand, was the sovereignty of reason. Reason is the highest source of intellectual authority, and its findings trump religion and tradition. Reason establishes the value of religion and tradition, but neither religion nor tradition can evaluate reason. As Immanuel Kant put it in the Critique of Pure Reason (1781):

Our age is, to a preeminent degree, the age of criticism, and to criticism everything must submit. Religion through its sanctity, and the state through its majesty, may seek to exempt themselves from it. But then they arouse just suspicion against themselves, and cannot claim the sincere respect which reason gives only to that which sustains the test of free and open examination. (Critique of Pure Reason, A xi, note.)

The Question of Being

In Being and Time (1927) Heidegger says that he wrote the book in order to “reawaken the question of the meaning of Being.” It’s important to pay attention to all the words in this phrase.

Heidegger’s question is not the one that Leibniz asked, namely “Why is there something rather than nothing?” That question asks for an explanation – a cause, a sufficient reason – of the fact that anything exists at all. Heidegger, on the other hand, wants to know what it means to say that something exists. Or rather, he wants to ask what it means. That implies that the meaning of Being is not well-understood, which, Heidegger thinks, is significant in ways we don’t sufficiently appreciate.

Asking “What is the meaning of Being?” is paradoxical in that the use of the word “is” in the sentence implies that the meaning of Being is already known. The meaning of a sentence such as “The scarf is blue,” for example, seems clear enough.

That isn’t to say that “is” is completely unambiguous. According to logicians, “is” as used in the sentence above is just one of four ordinary uses of the verb “to be”: predication (as in the sentence), identity, subsumption, and existence. Logicians use different symbols for each sense of “is” to remove the ambiguity, but in ordinary communication it is usually clear which sense is intended – if not from the sentence, then from the context.
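As a rough illustration (the example sentences and notation here are a conventional first-order rendering, not drawn from Heidegger or from any particular logic text), the four senses can be symbolized as follows:

```latex
% Four ordinary senses of "is," disambiguated in first-order notation
\begin{align*}
\text{Predication:} \quad & \mathrm{Blue}(s)
  && \text{``The scarf is blue''}\\
\text{Identity:}    \quad & h = p
  && \text{``Hesperus is Phosphorus''}\\
\text{Subsumption:} \quad & \forall x\,(\mathrm{Whale}(x) \rightarrow \mathrm{Mammal}(x))
  && \text{``The whale is a mammal''}\\
\text{Existence:}   \quad & \exists x\,\mathrm{Unicorn}(x)
  && \text{``There is a unicorn''}
\end{align*}
```

Once the senses are separated this way, no single symbol for “is” remains; that is the disambiguation Heidegger thinks leaves the more fundamental sense of Being unasked.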

Heidegger argues that each of the four senses derives from a more fundamental sense, although that sense has receded so deeply into the background that we’re not fully aware of it. Being, Heidegger says, means that something is present to us in a way that makes sense. In Heidegger’s various formulations, it has been “uncovered,” “unconcealed,” “disclosed,” “granted,” or “bestowed.” To put it differently, Being is that which reveals. In Being and Time, that which reveals is our comportment – the “understanding of (the meaning of) Being” that is embodied in our ability to respond differentially to the various entities in the world. For the later Heidegger, that which reveals is language, and changes in philosophical language over time track the variations on the Platonic understanding of Being that constitute much of our spiritual history.

On Self-Reliance

In “Self-Reliance,” Emerson says that “[w]hat I must do … not what the people think … may serve [as the rule] for the whole difference between greatness and meanness.” Later he adds that “we have not chosen [our occupations] but society has chosen for us.”

If the problem is that society has chosen our occupations for us, then the remedy would seem to be for us to choose for ourselves. But that’s not quite right, because the choice of an occupation – or rather, a vocation or calling – is importantly different from the ordinary exercise of free will.

In “Spiritual Laws” Emerson writes:

I say, do not choose; this is a figure of speech by which I would distinguish what is commonly called choice among men, and which is a partial act, the choice of the hands, of the eyes, of the appetites, and not a whole act of the man. But that which I call right or goodness, is the choice of my constitution; and that which I call heaven, and inwardly aspire after, is the state or circumstance desirable to my constitution; and the action which I in all my years tend to do, is the work for my faculties.

The “work for my faculties” is my vocation or calling. The choice of vocation is not accomplished by “what is commonly called choice” but is rather “the choice of my constitution,” i.e. of my whole self. A vocation chosen by one’s “whole self” is one that defines oneself, and in that way it is more like an unconditional commitment than a decision, relative to the circumstances, to take one course of action rather than another. Committing oneself to a calling is something like yielding to necessity.
