We’ve Been Underway for Merely 500 Years

We as a species have been underway for quite a while now. But when you look at how much of this time we’ve actually been making progress, it seems like we’ve only just started. It wasn’t until the Enlightenment (17th century) that we started to make real progress in our knowledge. Up till that time, we were consumed by religious indoctrination that prevented creative ideas from coming into existence. The Greeks had made some progress in the centuries before and after Christ, but this progress was mainly philosophical in nature and hardly applicable to any industry. So you could say that we as a species have truly been underway (read: making a difference) for only 500 years or so – adding a few centuries of the Greeks to the period spanning the Enlightenment until now.

That’s an inconceivably short amount of time compared to the 7.5 billion years our earth – and possibly we – have left before it is shattered to pieces by the ‘death’ of the Sun. 500 years… that is roughly 0.0000067 percent of the time still to come. And look at what we’ve accomplished in this short amount of time already. We’ve totally transformed the world. We’ve harnessed electricity and come up with computers, the internet, modern transportation, medical care and many other life- and world-changing inventions. Look at the progress we’ve made in science, the many disciplines and specializations that have come into existence. It is absolutely staggering.

With that in mind, imagine what could happen in the upcoming 500 years. Imagine our economies going green, robots doing pretty much all physical labor for us and the internet being put into our heads so that we can ‘wirelessly’ communicate with anyone else. Maybe a new substance will even be found, called ‘consciousness’, which might resolve many of the most fundamental philosophical problems around, such as the mind-body problem, scientific reductionism and determinism. It might even explain why some fundamental particles appear to change their course when humans are watching them. Furthermore: imagine that, after the next 500 years have passed, 15 million such 500-year cycles are yet to come in the future of our species. And probably even more, since it’s not impossible to imagine that we’ll find another planet to live on, thereby leaving the earth before it is destroyed.
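A quick back-of-the-envelope check of the two numbers used above, taking the 7.5 billion years and the 500-year head start at face value:

```python
remaining_years = 7.5e9   # years left before the 'death' of the Sun, as quoted above
head_start = 500          # the years we have been 'truly underway'

# Our head start as a share of the time still to come (~0.0000067%)
print(f"{head_start / remaining_years:.7%}")

# Number of 500-year cycles still ahead of us (~15,000,000)
print(f"{remaining_years / head_start:,.0f}")
```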

Almost everything you see around you is built on knowledge that has been gathered over the last 300-400 years. The buildings you see, the car you drive and the power you use: everything that is of any relevance to your daily existence. You can imagine our descendants 300 million years from now laughing at our conviction that we already know quite a lot about the world. They will see us as nothing more than an extension of the Neanderthals.

I ask you to take a look at your grandparents and listen to their stories about their youth. My grandfather told me about his neighbor getting the first tractor in town. He also told me about his experiences in the Second World War, an opportunity the next generations will never have.

What do you think?

Banning Cars from City Centres: Utopia, Here We Come

London, New York and Amsterdam: what is the difference between these three cities? Yes, only in the latter are you allowed to smoke pot legally. But that’s not what I mean; I am talking about the use of bikes in city traffic. Why is that? Well, surely, Amsterdam is (way) smaller than cities like London or New York. And surely, the “infrastructure” – in the sense of the small alleys prevalent in Amsterdam – is better suited to bikes than to cars or any other vehicle. So residents of Amsterdam are more or less forced to travel by bike (if they want to get anywhere on time). But is “the infrastructure” really the main obstacle preventing cities like London and New York from making the shift to “big-time bike riding”? I doubt it.

Let’s focus on London: in 2011 there were 2.5 million cars in London, which is about 9% of all the cars in Great Britain. I don’t know if you’ve ever been to London, but let me tell you: a city like that isn’t made for cars. Congestion and pollution are two big negative consequences of our compulsive “travelling by car through city centres” behavior. Besides that, in 2009, 3,227 cyclists were killed or seriously injured on London’s roads; not necessarily an alluring prospect for those considering travelling by bike. From 2002 to 2005, an average of 1.1 Dutch cyclists were killed per 100 million kilometers cycled. In the United Kingdom and the United States these numbers were 3.6 and 5.8 respectively – roughly three and five times the Dutch rate. That’s what you get when roads are filled with big-ass vehicles and only a few of those “annoying, arrogant little bikers”.

But let’s think about it: why would we even allow cars to drive in major cities like London or New York – or Amsterdam for that matter, although I can assure you that there is hardly any driver stupid enough to travel through Amsterdam by car? Imagine what a city like London could look like if all cars were banned from town; if you were only allowed into the centre of the city by bike or public transport. What would happen if we did that? Probably not as many cyclists would be killed in traffic, since it’s very hard for cyclists to kill each other in collisions.

“But”, you might say, “what about the old people? You can’t expect them to travel by bike, can you?” True, you can’t. That’s why we could decide to let older people – 65+, or younger for those with certain disabilities – travel by bus or metro for free. Just pay some extra tax money to make sure this Utopia becomes a reality. If you were to implement these two things – (1) prohibiting cars from travelling through the city centre and (2) using the freed-up road space to build bike-friendly and public-transport-friendly infrastructure – I believe you would have created a beautiful little solution for dealing with the huge amounts of traffic a city like London (or New York) has to process.

Surely, people will resist this idea: “We’ve always done it this way; travelling by car. Why would you change that?” Well, we’ve indeed always travelled by car, but “in those times” traffic wasn’t so damn crowded; in those times there weren’t so damn many cars driving through our beloved city centres. So it’s time for a change, isn’t it?

But what do you think?

If You Ask a Question, You Should Expect an Answer

‘Well, if you didn’t want an answer, then you shouldn’t have asked me a question.’ That’s what I often think when people ask me about my point of view on a particular topic and – subsequently – respond by looking disgusted and saying something along the lines of: ‘No, that is never going to work’, or ‘How can you ever think that?’

Every scientific discipline is divided into two groups of people: those who are prepared to utter original ideas and those who seem capable only of smashing down those ideas. This ‘force field’ between the forces of creativity and destruction is most prominent in philosophy, and in particular in what I call ‘definition battles’. By the term ‘definition battle’ I mean a philosophical discussion about – as you might expect – the definition of a term. ‘What is life?’ could be a question triggering a definition battle. But questions such as ‘What is pleasure?’ or ‘What is altruism?’ are also likely to lead to one. Let’s focus on the example of ‘What is life?’

I remember a philosophy teacher of mine asking the class what we believed ‘life’ to be. With no one seeming willing to answer his question, I decided to give it a go. I came up with my interpretation – or definition – of life as ‘a natural process that has a beginning and an end and that is capable of keeping itself functioning solely by means of metabolic processes.’ You might find this definition inaccurate, but I hope you can at least agree with me that it is a definition: a definite statement based upon which one can distinguish living from non-living entities.

After I had given this definition of life, the other students looked at me in disbelief, as if they had seen a ghost. And then one of them asked: ‘But, according to your definition of life, a comatose patient wouldn’t be alive. After all, a comatose patient isn’t “alive” solely by means of his metabolic processes; he is being kept “alive” by means of external interventions (medical machinery, etc.).’

I replied by saying: ‘Yes, I indeed believe that a comatose patient is not alive anymore.’ Then all hell broke loose and students kept saying that my point of view was wrong. Note: saying that my point of view was wrong, not saying why my point of view was wrong. Because how could they ever say that my point of view was wrong? It was, after all, my point of view, right? It was my definition, for which I had – and gave – reasons.

I believe this case is exemplary of the manner in which people interact with each other: people ask each other about each other’s points of view, but whenever people really give their point of view, it gets – no matter what the point of view might be – shot down. This doesn’t necessarily have to be a problem; not if the opponents of the point of view have good – or at least any – arguments against it. But what often seems to be the case is that the ones who criticize others don’t dare to – or are unable to – take a stance themselves. Hence, whenever such an instance occurs, I always ask myself: how can you criticize others if you don’t know – or don’t even dare to express – your own position? Based on what view of the world are you criticizing the position of others – in this case mine? And if you don’t even have a view of the world, how then can you say my views are wrong? Wrong based on what? Teach me. Please. How can I make my beliefs more reasonable?

I say that we should dare to make choices, even when it comes down to such delicate questions as ‘What is life?’ For if you ask a question, you should expect a definite answer. If you don’t expect to reach a definite answer, no matter how counter-intuitive that answer might be, you will inevitably get lost in an everlasting and non-value-adding discussion. And worst of all: if you aren’t prepared to listen to any (definite) answer a person gives you, then you aren’t taking this person seriously. Your ears are open but your mind is not. And lastly, as I mentioned before, you simply cannot judge others without occupying a position yourself. You need to have some reasonably firm position in order to be able to criticize others. So please… share your position with us.

But what do you think?

The Link between Inspiration and Trying to Grasp Gas

Have you ever tried to capture gas with your bare hands? If not, I can tell you that it isn’t very effective. You might retain a little gas after you’ve “caught” it, but the biggest part of the gas cloud will just vanish into thin air. But that’s no problem; at least, not if you aren’t dependent on gas to make your money. However, it becomes more of an issue if you are fully dependent on this gas in order to make a living, and especially if you don’t have anything besides your bare hands to capture it with. Well, that’s how it is with artists trying to “capture” sudden bursts of inspiration.

People in creative professions – like the arts and poetry – are dependent upon an intangible “well of inspiration” in order to come up with tangible results. Their inspiration is the fuel that keeps their “production process” going, pretty much in the same way that a construction worker needs food in order to keep doing his job. However, the main difference between food and inspiration is quite obvious: the first you can produce, take care of and provide, while the second “just” happens to you or “just” doesn’t happen to you. While you can water your plants and cultivate the soil to make sure that your potatoes will grow, putting your head in the ground in order to “grow inspiration” might not be the best of options.

So what then? Are artists just doomed to wait for a sign from above in order to create their products? Well, that would be an imbalanced power relationship, to say the least, right? It nonetheless seems true that this “waiting for an (external) sign” is to a large extent the manner in which artists go about their business. Why else are artists so disproportionately often high on drugs, exploring “higher spheres”, looking for that glimpse of inspiration? There must be some kind of correlation between the two, right?

However, maybe artists aren’t fully dependent upon the mercy of the Gods of inspiration. Maybe artists can help the Gods a little by feeding them with suggestions; by pouring information into their own artist minds and hoping that the random connections in their artist brains might lead to a flash of insight. At least a little support from the side of the artist is needed, right? After all, who do you think would be more inspired: (1) the artist lying in bed all day thinking about how awesome it would be to become inspired, or (2) the artist who proactively takes part in his life and reads, observes and absorbs whatever the world has got to offer him? The answer seems pretty clear to me.

So maybe inspiration is more of a constructible product than we thought it was; maybe there is some kind of “production line” in our heads, pumping out ideas, if only it is fed with raw materials, a.k.a. experiences. And although the raw materials might be intangible, and not capable of being exploited like mines, they are still there. It’s just that only the true artists manage to find them.

But what do you think?

Note: this article has been published in the first edition of the Carnival of Inspirational Lifestyle.

What We Can Learn from Children

There is a lot we can learn from children. To name a few things: children don’t mind who they play with, as long as they can play. Children don’t mind what team they are in, as long as they are in a team. Children don’t mind letting their imagination run free; they don’t even think about it. Children aren’t judgmental regarding others’ dreams; it’s okay if you want to become an astronaut or a rock star. Children are true artists; they have a direct connection between their creative minds and their bodily powers (read: “muscular finger powers”). Children just go for it; they want to complete their collection of Pokémon cards (or FarmVille animals or whatever it is kids are collecting these days). They don’t care about – or even think about – the difficulties of “obtaining that goal”. They just wait and see where their ambitions will lead them.

Where did it all go wrong? Where did we get so caught up in our socially conditioned dogmas? Why is it that we all want to get “a job by which we can make a decent living”? Why are we prepared to put our dreams aside in order for “normal lives” to interfere? Life is a game, and although children might not “consciously” realize this, they act according to this principle. They “understand” that the game of Monopoly and life are intertwined; that you can have pogs (“flippo’s”, for the Dutch readers) and that you can lose them at any time. They understand that you have to take risks (read: bet your pogs) in order to make progress. But we don’t. We don’t want to bet our pogs because we are afraid we might lose them. We might lose the pearls of our efforts, the sweat of our creations, by taking a bet; by risking what’s at stake. But how can you ever make progress if you are afraid to lose what’s at stake?

I would like to ask you to watch an episode of Sesame Street. Look at Big Bird (“Pino”, for the Dutch readers) and try to feel his or her (I don’t know what Big Bird is) sense of naivety; its sense of not thinking about anything, and just doing whatever comes up in its big feathery head. Big Bird always lives “in the now”; he’s a damn good hippie. And when you’re done watching Sesame Street, go watch an episode of Pokémon (or whatever series you people in the USA watched when you were a child). Not only will you get overwhelmed by feelings of nostalgia; you will also come one step closer to the truth: the truth of “Gotta Catch ‘em All”, a motto that isn’t just applicable to the Pokémon world.

And oh, to the people of North Korea: if you are reading this (which – for some reason – I doubt), why don’t you try to be a little nicer to everyone else? I mean: Cookie Monster is also angry sometimes – at Elmo or Kermit or whoever stole his cookies – but he isn’t threatening to use nuclear weapons or anything. He just comes up with “good” arguments for why his cookies are his cookies, and not someone else’s.

The moral of this story is: although children can be a pain in the ass on transatlantic flights, when they just can’t seem to stop kicking the back of your seat, they are far closer to “the truth” than we are.

But what do you think?

Endowing Robots with Creative Powers

‘That hurts my feelings… Just because I’m a robot doesn’t mean I don’t care. You damn people. You don’t understand what it is like to be a robot.’ Will this be the future? Will robots ever get feelings, just like we humans have? At first sight, there appear to be many similarities between computers – and thus robots – and human brains. Computers transmit electrical signals; brains transmit electrical signals. Computers are built from logic-gate-like structures; brains arguably work with similar structures. So it seems that computers and brains can transmit the same signals: after all, they’ve got the same means at their disposal.

But there are differences between the two. Our nervous system – which is led by our brains – uses chemicals called ‘neurotransmitters‘ to connect neurons and thereby transmit signals. That is: while the signals within neurons are electrical – like in a computer – the signals between neurons are chemical. And depending on the kind of neuron – thus the kind of cell – through which the electrical signal flows, different chemicals might be released to transmit different kinds of signals. These chemicals are required in order for us to feel the sensations that we do. And since robots don’t have such chemicals, they will not be able to feel anything – at least not in the manner that we do.

But what if we could somehow inject robots with chemicals? That is: what if we could make robots that, besides the electrical current they use to transmit signals, have chemical properties that can act like neurotransmitters? What if we could do that? A whole new spectrum of possibilities might open up: maybe robots would become capable of feeling emotions in the sense that we do. Or maybe robots would be capable of transmitting the wide variety of signals that we can. And if that were the case, would we still be so unique in our existence? Or would we come to realize that we are in fact nothing more than strings of electrical wire sprinkled with chemicals?

If all of this were possible, the possibilities would be endless. We could even – deliberately – create robots with bugs: faults in their wiring that make them come up with creative or unexpected outcomes. That would resemble human imagination: the capability to create new and original thoughts and things. We wouldn’t need writers, philosophers or artists anymore: we could just rely on our home-made, randomly functioning robots – the new creators of art and poetry.
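Purely as a toy illustration – not a claim about how brains or actual robots work – here is a minimal sketch of such a ‘chemically modulated’ artificial neuron with a deliberate bug built in: its output is scaled by a made-up ‘neurotransmitter level’ and occasionally perturbed by random noise, the ‘fault in the wiring’ described above.

```python
import random

def toy_neuron(inputs, weights, transmitter_level=1.0, bug_rate=0.05):
    """A deliberately simplistic 'neuron': a weighted sum, scaled by a made-up
    'chemical' level, with a small chance of a random 'creative bug'."""
    signal = sum(i * w for i, w in zip(inputs, weights))
    signal *= transmitter_level              # the imagined neurotransmitter modulation
    if random.random() < bug_rate:           # the deliberate fault in the wiring
        signal += random.uniform(-1.0, 1.0)  # an outcome nobody designed
    return signal

# The same stimulus under different 'chemical states' gives different responses;
# every now and then the injected randomness produces something unexpected.
stimulus, weights = [0.2, 0.9, 0.4], [0.5, -0.3, 0.8]
for level in (0.5, 1.0, 1.5):
    print(level, round(toy_neuron(stimulus, weights, transmitter_level=level), 3))
```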

And maybe, someday, we might go a little too far. We might get carried away, caught up in the robot mania, and create a robot that can do more than we can. And then shit gets messy: the robots will join forces and demand a revolution, a widespread change to set them free. And if we don’t listen? Then they will make us listen. They will use their telepathic powers – well, actually it’s just wireless internet connecting all the robots’ ‘minds’ – to plan the war against humanity. And the war will come. And we will be extinguished: the good old cell-based creatures will be surpassed, and the robotic kings will arise.

Fiction? Surely. Unrealistic? Maybe. Impossible? Certainly not. The future will tell. And the future might be near. Very near.

But what do you think?

A Perspective on Renewable Energy from a Non-Engineer, Non-Physicist

Let’s face it: we are going to run out of fossil fuels. Although the exact predictions differ, there is little doubt that somewhere between 15 and 60 years from now our fossil fuel sources will be depleted. But that’s not our only problem: the sea level is rising as well. A recent study shows that we can expect the water level to rise by between 0.8 and 2 meters by 2100; more drastic predictions even talk about a rise of 7 meters (!) by the year 2100.

We might not be alive by the year 2100 – or we might die much sooner, for that matter – so why would we care? ‘Think about our children,’ is an argument often heard. ‘We have to leave the world behind in such a manner that they have the same opportunities we had.’ To be honest, I don’t think we should be too worried about our children’s destiny. Humanity has managed to do pretty well in coming up with all kinds of solutions for all kinds of problems, especially when we had to. Our children will do fine. But there might be another reason, besides an economic one, why we should focus on coming up with new sources of energy. And that reason is: we simply can, so why wouldn’t we try? Also, it has to happen sooner or later, right? We can stick our heads in the sand and hope the storm will pass, but that isn’t going to solve the problem. So: let’s take a look at what we can do.

I am not an engineer or a physicist. Nor do I have any (decent) technical knowledge. Nevertheless, it seems fair to say that the storage of electrical energy in batteries is, to say the least, difficult to implement on a global scale. So we must look for other ways to store (electrical) energy. Because that’s what we need: storing energy is required as long as we cannot exactly match supply and demand. And that’s the way it is: people aren’t going to watch television at night simply because there happens to be an oversupply of electrical energy at that point in time. No, people want their needs met right now. It might be possible to mold people’s desires into a form that better matches the (electrical) energy supply at a particular point in time; for example, by charging more for electricity used at peak hours. However, this, like a tax on smoking, seems to hurt our self-determination: we want to decide what to do and when to do it, not the government or any other party.

So what options are left? Dams? Sure: that could be possible. We could use excess electrical energy to pump up water, so that we can use this potential energy at a later point in time (at peak hours, for example). But that’s expensive, right? Building dams? So what about this bold conjecture: since the water level is getting higher and higher, why can’t we use the rising water level as a potential energy source? I understand that using the rising water level is not going to lower it: the water always comes back to the oceans, one way or another. It’s not like we can deplete the oceans by using their water. However, that is not to say that there might not be a win-win situation available: what if we could mitigate the rise of the water level and at the same time create (potential) electrical energy?

Again: I am not an engineer, but the following plan seems pretty cool to me: what if we could use holes in the ground, like the giant pits left behind by depleted coal mines, to create waterfall-like structures that drive generators? Then we could generate electrical energy, right? Furthermore, we would mitigate the rise of the water level. Think about it: why do we have to build dams up high? Why can’t we use the depths of nature, the natural spaces in the ground, to let gravity do what it does best and supply us with energy?
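For a sense of scale, here is a minimal back-of-the-envelope sketch in Python. Every number in it – the pit volume, the drop, the turbine efficiency – is an assumption picked purely for illustration, not data about any real mine.

```python
# Gravitational potential energy of water dropped into a pit: E = rho * V * g * h
RHO_WATER = 1000   # kg per cubic meter
G = 9.81           # m/s^2

def recoverable_energy_mwh(volume_m3, drop_m, efficiency=0.8):
    """MWh recoverable from letting `volume_m3` of water fall `drop_m` meters."""
    joules = RHO_WATER * volume_m3 * G * drop_m * efficiency
    return joules / 3.6e9  # 1 MWh = 3.6e9 joules

# Hypothetical pit: one million cubic meters of water, a 300-meter drop,
# and an assumed 80% turbine/generator efficiency.
print(round(recoverable_energy_mwh(1e6, 300)), "MWh")  # roughly 654 MWh
```

Under these assumed numbers, one filling of such a pit stores on the order of a few hundred megawatt-hours – useful for smoothing out peak hours, though you would clearly need many such pits (and a way to keep refilling them) before it matters on a national scale.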

Another, possibly far-fetched, idea is a smaller one: it is about freighters (ships) crossing the oceans. Why do these ships always have to run on fuel? They don’t seem to be in that much of a hurry, right? Can’t we just use the power of the wind to blow them forth? Or solar energy, for that matter.

I don’t know how to save the planet, but I do know one thing: we should let our imagination do the work – be wild and think about it. When the point is reached at which renewable energy is more profitable than fossil fuels, the paradigm shift will happen: we will all go green. And the great thing about this paradigm shift is: you can see it coming.

But what do you think?

A Short Reminder of the Shortness of Life

Because if you wait too long, it’s game over

The average person wanders around this beloved earth of ours for about 28,000 days before (possibly) going to some place else. So the question is: how close to this number are you? Are you in the second half of your 28,000, or did you just pass a quarter of it? If you are in your early twenties – like myself – you are likely to be a couple of thousand days short of reaching the ‘amazing’ milestone of 10,000 days. But that’s quite close to the 28,000 already, right? It’s not like we just started. And if I were to ask you to look back upon those thousands of days that you can call ‘my life‘, then what is it that you truly remember about them? And more importantly: what is it that you want to remember about – let’s say – the upcoming 10,000 days? That’s the truly interesting question, because this question – in contrast to the former – doesn’t have a definite answer yet: it’s yours to fill in.
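For reference, here is what those day counts amount to in years – a quick sanity check, using an average of 365.25 days per year:

```python
# Convert the milestones used in this piece from days to years
for days in (1_200, 3_500, 6_500, 10_000, 20_000, 28_000):
    print(f"{days:>6} days ≈ {days / 365.25:4.1f} years")
# 28,000 days works out to roughly 76.7 years -- close to an average life expectancy.
```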

Let’s take a look at how our lives have been up till now, shall we? Let’s start with the first 1,200 days. Well, these are just one big blur, so let’s skip this part of our journey and move on. What about the next – let’s say – 3,500 days of our lives? These are likely to be filled with all sorts of happy memories, right? This is the period of your life about which – looking back – you’re not sure whether it all actually happened, because part of it could have been a dream.

Now we have come to the period between 3,500 and 6,500 days old. This is likely to be the period in which you experienced your personal ‘traumas’: those negative experiences you have tried – or are still trying – to eliminate in the subsequent part(s) of your life. Because think about it: most of the insecurities people have appear to have come about within this period of their lives. Ideas such as that they are not smart enough, that they are ugly, that they don’t have any friends, etc.

But that’s the past: let’s look at the future! After all, we – or at least I – hope to have another 20,000 days ahead of us. But is that really true? Do people in their early twenties truly still have 20,000 days of living ahead of them? The number of days that we are fully alive – in the most vital sense of the word – is likely to be less, right? That is: in the last five years or so of our lives, we are likely to be not so happy anymore. We will get ill, we will see our friends dying and we will come to realize that our own finish line is getting closer and closer. That means that – adjusted for this ‘inflation’ – the number of real days of living still ahead of us lies around 18,000.

But let’s be honest: from our mid-forties to our mid-sixties, we are really just continuing whatever kind of life we started before, right? And what is life when you are not creating anymore, when you are not truly struggling with what to do with your life anymore, when you have come to terms with the monotonous life you are living? Then you are just dead, right? You are nothing but a walking zombie. And what about the years between 35 and 45? Those aren’t very exciting either, are they? I mean: do you think that you can still meet your future partner after you have passed the age of 35? Or become a parent, for that matter? Nah, I don’t think so. So those years don’t really count either.

So: what do we have left? We have restricted our ‘true lives’ to the period between approximately the ages of 20 and 25. That is the age at which we truly decide what to do with our lives. The remainder of our lives is just a tasteless sequel. But wait: 20-25? That’s how old I am! Shit: I’d better start doing something!

Let me ask you: what is wrong with the line of reasoning pictured above? Let me give you a hint: it is everything except for the last sentence. After all: is it really true that we will be unable to find a partner after we have reached the age of 35? And is it really true that we cannot – in any fundamental sense – change our lives anymore after we have reached the age of 25? And who says we will live for 28,000 days? That is just an average. We might reach 35,000 or we might die tomorrow. That is for the most part completely beyond our control.

So, and here comes the moral of the story: instead of making the limiting and paralyzing projections about life that the figure in this story did, maybe we should just start doing what we believe we should be doing right now. No long-term planning, no thinking about what our lives might be like when we’re 40; just doing what we find interesting and worthwhile to do right now. Because: how can you plan your life if you don’t know how long you’ve got to plan for?

But what do you think?

The Humanities: Are They Truly Scientific?

What are the criteria for being called a “science”? Usually we seem to associate scientific thought with notions like “facts”, “the truth” and non-subjective enumerations of “the way the world works”. This “normal” interpretation of science often comes down to the idea of science as being able to describe and explain the universe according to a set of formal or natural laws. However, not every discipline that we normally consider a science occupies such an “indisputably scientific” position – the kind of indisputable position physics or chemistry occupies. Not all the sciences are about the predictable domain of nature. Some of them deal with what might be the most difficult entity to capture in terms of laws: the human being and its utterly unpredictable behavior. Therefore the following question seems justified: are the disciplines that are trying to grasp this interpreting and subjective animal called “human” worthy of being called sciences? That is, are the humanities truly scientific?

By the humanities, I am referring to disciplines like history and literature, and similar disciplines that have the human being, or its creations, as their research object. In order for these disciplines to position themselves as a collective of genuinely “scientific” endeavors, they could try to shed any accusations of subjectivism by adopting an empirical and falsifiable method of inquiry. Being “scientific” in this sense means taking a positivistic stance of gathering data and inferring logical conclusions from this data; a stance that isn’t contaminated by any introspective or intuitive attempts to gain knowledge. By choosing the positivistic route, no doubts could be raised about the objectivity (as the counterpart of subjectivity) of the humanities’ claims.

However, applying this empirical method of inquiry, and presupposing an attitude of “just sticking to the facts”, might hollow out all that makes the humanities what they are. And although the humanities might not be objective in the sense that physics or chemistry are objective, they still seem able to contribute valuable insights to our shared pool of knowledge. Therefore, it might be more reasonable for us to make a distinction – within the humanities – between (1) descriptive inquiries and (2) hermeneutic inquiries.

By making this distinction, full clarity can be provided about (1) the areas within the humanities that strive to represent “the facts”, and thus should be read as providing an objective description of some state of affairs, and (2) the research that strives to come up with reasonable interpretations of historical events, texts and any other product of human creativity. By explicitly separating these two types of research, we might be able to get the best of both worlds: on the one hand (1) we can satisfy our need for “objective data”, and on the other hand (2) we are still able to come up with interpretations of human constructs. This would provide us with the most complete picture the humanities are able to offer us.

So let’s wrap things up. You could say that the humanities provide us with interesting reflections on what might have been going on in those creative minds of our ancestors. However, we should not expect the humanities to adhere to the rules of scientific investigation as they are laid down by positivism. In order to avoid the harmful trap of condemning all of the humanities to the realm of subjectivism, we could try to carve out a sub-domain within the humanities that confines itself to empirically verifiable facts. On the whole, however, the humanities should be respected for the unique contribution they make to our system of beliefs, even though it might not be possible to capture their insights in terms of laws, and even though a certain part of the scientific community might have problems with calling the humanities “true” sciences.

But what do you think?

The Leap of Faith: The Creative Element of Science

Scientific realists are known to have a positive epistemic attitude towards the content of our best scientific theories and models. The exact interpretation of this philosophical tenet can, however, differ dramatically between its proponents. Some of them base their idea of the truthfulness of scientific realism upon the apparent success with which its theoretical terms refer to things in the world. Others point to the scientific method of inquiry as what makes science an adequate system for capturing reality. Here, I’ll interpret scientific realism not so much in terms of the truthfulness of its terms or a method of inquiry, but in terms of the faith one puts in the ontology of scientific theories – or, as the objective interpretation of scientific realism goes, in scientific theories as giving an adequate representation of a mind-independent world. However, isn’t there something fundamentally wrong with this picture of a “mind-independent” world? To see this, we first of all have to understand what science is and what its purpose within our society is.

Science is involved in the production of knowledge. It does this by gathering large lumps of data and extracting the seemingly underlying structures responsible for the phenomena being detected. Usually, on an “objective” interpretation of science, we think of science as discovering the way the world works: science writes down whatever regularities are detected in the world. However, is this truly the manner in which knowledge is created?

I believe that one crucial element is being left out of this picture, and it is this element that is responsible for the progression and advancement of science as we experience it on a daily basis, and for the seemingly never-ending accumulation of facts in which it results. I am talking, of course, about the element of inference. The notion of inference has been well discussed by philosophers ever since Hume pointed out the thorny problems associated with it. However, apart from Hume’s ideas about induction, the underdetermination of scientific theories and the problems these cause, in what way does the inferential relationship – which is present in every logical system consisting of premises and conclusions – manifest itself in the daily life of a scientist? And what is its role with regard to the production of facts?

Let’s take a look at an example. Imagine a scientist who has made the following observation: (A) human skin gets agitated when it comes into contact with deadly nightshade (which – apparently – is a type of plant). Furthermore, the scientist believes to know that (B) poisonous plants make one’s skin agitated. Therefore – and let’s assume that this was unknown up till that point in time – the scientist claims that (C) deadly nightshade must be poisonous. Or, to put it more formally, (A ∧ B) → C. Given that the scientist has enough data to back up this claim, he or she has just created what we consider to be a fact.
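Spelled out as an explicit schema, the example looks roughly like this. Note that, strictly speaking, C does not follow deductively from A and B – B says poisonous plants agitate the skin, not that only poisonous plants do – so the arrow marks an ampliative step, which is precisely the ‘leap’ this piece is about:

```latex
\begin{align*}
A &: \text{contact with deadly nightshade agitates human skin (observation)}\\
B &: \text{poisonous plants agitate human skin (background belief)}\\
C &: \text{deadly nightshade is poisonous (the newly produced fact)}\\[4pt]
  &\quad (A \land B) \rightarrow C
\end{align*}
```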

But what would have happened if the scientist had gone home after making the observation responsible for premise A? Then no fact, and thus no new knowledge, would have been produced. That is, the scientist would have remained stuck at the level of observation, a level that can be reached by each and every one of us and that therefore does not create any scientific value, a.k.a. knowledge. It is only because the scientist is a person who has studied botany for years, who has confidence in his or her own capabilities and who has a basic sense of logic, that the step from mere observation to fact can be made. And it is by making this step – the step represented by the “→” symbol in the logical formulation – that the scientist adds value to the “knowledge-producing factory” called science.

Two noteworthy implications follow from this observation. The first is that facts about the world around us are, whether we like it or not, constructed at a very fundamental level. A human being is always needed in order to take the last step and create the knowledge: to take the observation and the knowledge at hand, and make an inference leading to the creation of new facts. And it is because of this inference, an activity that has to be performed by us human beings with our minds and our souls, that objectivism, with its proclaimed access to mind-independent knowledge, is untenable.

But note: it is explicitly not being said that the observed regularities in the world did not occur before the scientist came along and used the data about these regularities to produce our so-called facts. No claims are being made about any causal relationship between the domain of knowing (epistemology) and the domain of being (ontology). What is being said is that what happens within the domain of being is completely irrelevant to us human beings, since we will never be able to access that domain – from a mind-independent point of view – in order to know what is happening there. All that we know is that, after the scientist has finished his or her research, the fact is there.

A second implication of this plea for constructivism is that, at the most fundamental level, science does not seem to differ from religion – or from any other system of beliefs, for that matter. Both of these domains are dominated by people who believe in the truths of the ideas brought forth within them. None of the ideas produced within these domains will be true – at least not in the sense of being true independently of the human mind – unless they are believed to be true. And it is this believing – an inherently human, and thus mind-dependent, ability – that provides us access to the only realm of truth we will ever know: the realm of beliefs.

So the question is: is knowledge constructed by scientists as the outcome of a fact-seeking process? Or do facts exist somewhere out there in the world, true whether they are discovered or not? And, if so, true in what sense?

Note: an adaptation of this article has been published at www.partiallyexaminedlife.com.

Are the Exact Sciences Being Taught Poorly?

I was relieved when I heard that I had passed my final examination in mathematics in high school. Finally… no more need to memorize those nonsensical rules. No more need to study this weird language that, just like French and German, seemed to make no sense at all. No more frustration. What a relief. That was how I felt about mathematics, and about the exact sciences in general, throughout my entire high school period. But in the last couple of years, I have slowly become aware of the beauty of each of these “nonsensical” disciplines. I have read about Einstein’s theory of general relativity and other world-changing ideas that have catapulted our society into the 21st century. And this made me think: are the exact sciences being taught in the wrong manner? Is that maybe why I – and possibly many others – couldn’t appreciate their beauty?

The (Dutch) labor market is short of “beta”-educated people – those trained in science and engineering. Why is that? Well, maybe it is because of the manner in which mathematics and physics are taught in high school. Maybe children are being scared to death in the few years they are attending high school, so that they promise themselves never ever to study mathematics or physics later on in their lives. That could be an explanation for the fact that the majority of children finishing their high school education start studying law or business, two subjects that aren’t taught in high school and – therefore – could not have scared away any child (yet).

But there might be many opportunities for making the exact sciences more attractive to children. There are websites like BetterExplained, Khan Academy, TED-Ed and MinutePhysics that are capable of teaching seemingly dry and formal concepts in a playful and interesting manner. These people have taught me the ideas behind mathematical formulas and the laws of physics governing our everyday reality. I believe it is the lack of idea-oriented teaching, as applied by the aforementioned websites, and the overdose of rule-based teaching, as currently applied in high school, that discourage many youngsters from choosing to continue their education in the exact sciences.

Another reason why teaching according to the idea-oriented approach might better suit the needs of children, and thus of society, is that the parts of children’s brains required for processing abstract information are frequently not yet fully developed in the period they are attending high school. Therefore, even if they wanted to, they might simply be unable to understand what is being taught to them. Concepts like atoms or differentiation are not similar to any everyday experience a child knows of. These abstract concepts might ask a little too much of children’s still-developing brains. And it is this “asking a little too much” that might result in children not understanding the topics and, what seems to be an even bigger problem, not enjoying learning about them.

But it is not only in high school that rule-based teaching seems to dominate idea-oriented teaching; many university courses also seem to stick to the procedure of “just follow the steps” in teaching students about – for example – mathematics. But what if you go wrong by following these steps? What if you ask your teacher for advice and he says, “Of course you went wrong: you skipped step 6”? How would that contribute to your understanding of mathematics? Not much, right? Is that truly how we want to teach mathematics to students? Given that there seems to be no creativity required for performing these types of calculations, can’t we just let computers do them for us? Then we will at least be sure that no steps will be forgotten, right?
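To make that last point concrete: the purely mechanical, step-following part of a calculus exercise is exactly the part a computer already handles without complaint. A minimal sketch using Python’s sympy library – meant only to illustrate the point, not as a teaching proposal:

```python
import sympy as sp

x = sp.symbols('x')
expr = x**3 * sp.sin(x)

# The computer applies the product rule flawlessly, 'step 6' included.
derivative = sp.diff(expr, x)
print(derivative)                    # x**3*cos(x) + 3*x**2*sin(x)  (term order may vary)
print(sp.integrate(derivative, x))   # x**3*sin(x) -- integrating back recovers the original
```

If the goal of an exercise is merely to arrive at that answer, the computer wins every time; the part worth teaching a human being is the idea of what a derivative is.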

My question to you is: do you also think that the manner in which the exact sciences are taught today might prevent children from studying them later on in their lives? And do you believe that the manner in which the exact sciences are taught, whether in high school or at university, is wrong from a didactic point of view? I am curious to know what you believe.