From Sustainability Methods

In short: This entry revolves around the history of Time as a concept, and its implications for scientific inquiry. Please refer to the Design Criteria of Methods entry for a general overview on all design criteria.

Time is one of the most misunderstood concepts in terms of methods (and for humans in general), and together with space (or spatial scale, or grain) it poses one of the key challenges in the methodological aspects of science. Here, we give a short introduction to the different aspects of time from a methodological standpoint. Starting with the most relevant general aspects of time, the text then focuses on concrete methodological aspects of time. An outlook on the necessary next steps concerning time in scientific methods concludes the text.

A historical view on time

We humans are a mere glitch in the long cosmic developments that already lie behind us. Much time has passed, billions of years, and we as humankind have only been here for a few thousand or a few hundred thousand years, depending on how you define us. Yet within this comparably short or even negligible period, we have become an entity that - according to Derek Parfit - starts to understand the universe. In the long run, this may be our most remarkable achievement, and we have already understood as much as that we will probably not be here forever. This has led to many debates about the reasons for our being and our role in the universe, and these debates will likely not end anytime soon. There exist diverse flavours of making sense of our presence, and while I invite everyone to participate in this great abyss of a debate, I am sure that it will be difficult to come up with a conclusion that we all share. Following Derek Parfit, we may agree that our future could be much better than our past, and it would be worthwhile in this spirit to go on, and to contribute to a future of humankind that could be so much better than our past. Nihilism, existentialism and many other -isms tell us not to buy into this optimistic rhetoric, yet personally, I would disagree.

Thus, let us focus on the main obstacle to this bright future that we have faced ever since we began: temporal discounting. This principle describes that we humans value everything that occurs in the close past or future as more relevant than occurrences in the distant future or past. This is even independent of the likelihood that future events will actually happen. As an example, imagine that you want a new computer every few years, and you can pay 5€ for a 50% chance to get a new computer tomorrow. If you actually needed one, you would surely do that. However, even adjusted for inflation, many people would not pay the same 5€ for a 50% chance to win the latest computer in 10 years. What difference does it make? It is the latest tech in either case. Temporal discounting is one of the main ways in which people act unreasonably. This extends well beyond the point when we will already be dead, although this also plays a role. Our unreasonable inability to transcend temporal discounting extends likewise into the past. The strongest argument to this end is how many people insist that they are worse off today compared to the past - the 'good old days'. The mistake that is typically made is that some aspects of the past were better, while many others were worse. People hence have a tendency to value time periods closer to them differently from time periods more distant. For the past, only the good things are remembered, while for the present, the negative sides are also acknowledged. It widely depends on other circumstances whether this unreasonable evaluation is better or worse than a more objective view of each time period. In any case, the sheer difference between how people evaluate distant time periods or events compared to closer ones is one indirect motivation for science to put this knowledge into perspective.
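Temporal discounting can be made tangible with a small numerical sketch. The snippet below is purely illustrative: it assumes Mazur's hyperbolic discounting model (V = A / (1 + kD)), a hypothetical discount rate k, and a hypothetical computer worth 1000€ - none of these numbers come from the example above, they merely mimic it.

```python
def hyperbolic_value(reward, delay_years, k=0.5):
    """Perceived present value under hyperbolic discounting: V = A / (1 + k*D)."""
    return reward / (1 + k * delay_years)

# A 50% chance to win a (hypothetical) 1000 EUR computer has an
# objective expected value of 500 EUR, regardless of the delay.
expected_win = 0.5 * 1000

for delay in (0, 1, 10):
    v = hyperbolic_value(expected_win, delay)
    print(f"delay {delay:>2} years -> perceived value ~{v:6.1f} EUR")
# delay 0 -> 500.0, delay 1 -> 333.3, delay 10 -> 83.3
```

With these assumed numbers, the identical bet feels worth about 500€ when resolved tomorrow but only about 83€ when resolved in ten years, even though the objective expected value never changes - which is exactly the unreasonable pattern described above.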

Understanding the past

Historical research, as well as other branches of science that investigate data or information from the past, allows us to put our current knowledge, context and situation into the context of past periods or events. This is widely driven by the historical sources at our disposal. Palaeontology, the science that deals with the oldest records, is a good example of how single artefacts and findings can be embedded into a quantitative timeline via radiometric dating. By measuring the decay of radionuclides in the objects, it is today possible to track the age of fossils rather well. However, before the establishment of this hard scientific method, palaeontologists and geologists relied widely on the context of the sediments in which the fragments were found. Sedimentary history hence became an early dating method, and many other indicators, such as thin layers left by global catastrophes through meteors, allow researchers to pinpoint temporal origins with often high precision. One prominent example is the layer at the Cretaceous-Tertiary boundary, when an Earth-shattering meteor not only extinguished the dinosaurs, but also created a thin sedimentary layer that allows this event to be dated with remarkable precision. Other important methods for dating distant developments can be hidden in the molecules and systematics of many organisms, since sophisticated DNA analyses often allow researchers to generate a holistic developmental history of organisms if suitable material for DNA analysis is found. To this end, the preserved record in moorlands is often a source of material dating back tens of thousands of years, creating a database of the development of life itself.

Human development can be preserved as part of the archaeological record, whose study is often a deeply qualitative analysis describing and contextualising relevant artefacts. Written accounts and paintings, for instance, are among the vital sources we had before the invention of photography. Before the written word, but also during its existence, there is the archaeological record, which investigates surviving pieces - quite literally - of the past. This leads to a bias that we all - unfortunately - suffer from: much of the archaeological record is preserved from the temperate or subtropical zones of planet Earth, where erosion, wind and weather did not destroy as many remnants of the distant past, while in the tropics much of what once was has been destroyed by the elements. Hence, little information is found, for example, in sub-Saharan Africa, despite indirect hints pointing to long-standing cultures, and the rare fossil finds even testify to the origin of humans as a species (Homo sapiens) in Eastern Africa. Importantly, the precious testimonies of the past are often part of a bigger emerging picture of humankind, hence expert knowledge of a specific period or culture is pivotal in contextualising each relevant finding in a systematic way. These archaeological records can be locally clustered, opening windows into regional developments, yet sadly omitting other developments, especially in environments that are unfavourable for preserving precious artefacts, such as the tropics. This exemplifies that our understanding of the past is often like looking at a puzzle of which only a few pieces are left, making it difficult to make out the whole picture. One methodological approach can be to actively try to reproduce the living conditions of the past, and maybe find out what certain artefacts were used for.

Once we reach the time of the first complex civilisations, the archaeological record becomes more abundant, and whole buildings or even settlements are preserved. Early paintings and pieces of art give testimony of the ethical understanding of our ancestors, and with the preservation of language the road is paved to a direct documentation of past times. The Rosetta Stone is a famous example of a key to unravelling the ancient language of Egypt, allowing much of the record preserved in tombs to be translated. Early texts in the East were often preserved far from their origin, with the Pali canon being a famous example of texts that were almost entirely destroyed in their region of origin in India due to the Muslim invasion. On the other hand, much of what was written by the ancient Greeks has been preserved in Persia and the Muslim world, and the Bible is a famous example of a text that found its way to Europe through Greek translations. Hence the analysis of texts is among the first systematic scientific methods of analysis, and dates back to the origins of texts themselves. All world religions build on an - often critical - analysis and annotation of the texts closest to their origin, often triggering an early diversification into the different branches of world religions. Different interpretations of the Koran or the Pali canon hence led to the main streams of these religions, and in the case of the Bible it was the translation of the original texts into lay people's language that triggered not only a new branch of Christianity, but also centuries of war and destruction. This shows how normative text analysis can be, and what its direct consequences are in the world.

Equal considerations can be made for the early methodological approaches that rose to prominence in antiquity, the most notable among them being astronomical observations, mathematics, physics, and early medicine. It was often within urban cultures that such early examples of scientific methods flourished, and they are still known today. A noteworthy example are the precise astronomical calculations of the Maya, which were often closely linked with their daily conduct, since the movement of the planets played a vital role in their religion. Their writing system shows similarities to Japanese writing and to many other approaches that provide a link to documentation through paintings, and thus deserves a deep interpretation well beyond the content of the symbols themselves. This documentation hence not only shows us how scientific methods emerged, but also allows us to investigate the past with our current methods. To this end, paintings are a form of art that deserves a closer analytical look, and many diverse approaches exist that allow for a close examination of the preserved record. It was the financial surplus in urban settlements that led to the development of much of what is now known as the different epochs of painters, and the digital age has made many paintings accessible through the Internet. Alongside representation through paintings, there is also a rising record of maps and other forms of more systematic representation that give testimony of a clearer geographical picture of the past. Equally, the texts of Herodotus can be seen as one of the earliest geographical accounts, albeit in writing. The scientific analysis of these records is vital to understanding the past, and the density of the historical record has generally increased the closer we move to the present.

With the rise of photography an altogether different kind of record was created, and the development of modern cartography had an equally dramatic influence on the density of knowledge that became available. The detailed scientific examination of diverse sources since the development of the printing press by Johannes Gutenberg was already a major development, leading to an exponential growth of the printed word. The printed image and the multiplication of pictures that came later unleashed an altogether different medium, leading to a totally different world. Once these images started moving and even talking, human civilisation created an exponentially growing record of the world, of which the Internet is the latest development. From a methodological standpoint, this diversity of media triggered two relevant developments: an ever-increasing differentiation of empirical analysis, and severe philosophical consequences of this brave new world. The role of art and its critique changed fundamentally, with consequences that translate directly into modern society. It is impossible to do justice to these developments here, yet Walter Benjamin should be noted as an early stepping stone towards a clearer role of the critic within society. This triggered direct and long overdue consequences for the scientific method as well, led to a redevelopment of the role of the critical observer and commentator, and ultimately unleashed a more nuanced view of science and its consequences. Methodologically, deconstruction as well as the critical perspective emerged over the last decades, all the while the possibilities of modern science diversified as well. The exponential track of the 20th century triggered a diversity in scientific approaches that by far dwarfs the scale of everything that existed before. Along the way, our understanding of the distant past has become a question of preserving the immense emerging record.
The amount of information that is being created and stored on the Internet is increasing at a pace that would have been unthinkable before, leaving us to at least try to spare some thoughts about the role this information may play in the future. Will everything be preserved for long? What is the turnover of our data? And what will the information we create mean for future people?

Understanding the future

Derek Parfit concluded that our future may be wonderful, and that we cannot make the ethical decision whether future people should exist at all. In other words, we have no moral grounds to end human history. The fact that humans can think about the future is one of the most defining elements of our species. Compare us to the chipmunk. The chipmunk may store nuts, and forget about most of them. Birds migrate in anticipation of the seasons changing. Whales may follow their food. It is probably only us who have an abstract understanding of different futures, and can adapt our actions based on this knowledge. To do so, the scientific examination of our futures has become more and more systematic over the last centuries and especially decades. We know that the earliest cultures cared about their future - the artefacts found in graves showcase the complex world our ancestors anticipated in the afterlife. Some of the earliest texts offer a testimony of what might happen in the future, often as a motivation or a basis for moral authority for the living. Joseph's interpretation of the dreams of the Pharaoh showcases how the anticipation of a possible future and its shortcomings was central to ancient culture.

While the oracles and mysticisms of the ancients were often complicated yet not systematic in their methodological approaches, this changed with agriculture. Human civilisation got domesticated by its crops, and depended on their harvest. The demise of the Maya may be an early testimony of crop failures, and especially in Asia and Europe, records of the central role of the harvest within the seasonal calendar have been preserved for centuries. The timing of the wine harvest is a record often kept for several centuries, and deviations from the known averages often led to catastrophic famines and migrations. To prevent such catastrophes, societies began to index and plan their agricultural practice into the future, and the rise of numbers - with examples from basically all early cultures - testifies how this allowed many diverse cultures to thrive.

However, more qualitative and vivid approaches to the future also emerged in literature and the other arts, among them More's Utopia as an early testimony of how he imagined a radically different society as early as 1516. Sadly, Christopher Columbus, in his anticipation of a new westward path to India, triggered a reality that was - and still is - so different from More's anticipation of the future. The human urge for discovery is what drove many people into the New World, often looking for a more hopeful future. The people who already lived in these regions of the world had no means to anticipate the grim futures most of them would face under these developments. Through the rising inequalities of colonialism and other deeply regrettable developments of the rise of Europe, a surplus in resources was extracted from the colonies, enabling an economic system that was built around a more systematic exploitation of the regions that would become the Global South. The urge to make a business became the backbone of the rise of utilitarianism and its associated economic paradigms, each willing to cash in on an anticipation of their future, typically at the cost of the future of other people. This showcases how the capability to anticipate the future, combined with the privilege of the means to act on that anticipation, created an inequality that is basically still rising today.

Scientific methods such as Scenario Planning were initially tools of a mercantile elite willing to improve their future chances of profit. Many of the dramatic environmental catastrophes of the 20th and 21st centuries were rooted in the systematic ignoring of small or even imperceptible risks that still became a reality. Chernobyl, Minamata, Bhopal, the Exxon Valdez and the Aral Sea are testimonies of how risks were perceived incorrectly, with dire consequences for the environment and the people in it. Anticipating potential futures has branched into diverse possibilities rooted in the emerging methodological canon. Examples are the several approaches known under the term Machine Learning, which allow for a data-driven anticipation of future developments, often with a focus on correlative patterns. Another development in science is the inclusion and joint learning of science and society together. Methodological approaches such as Visioning allow for a more integrative perspective on potential futures based on the diverse knowledge of the involved actors. Often applied in sustainability and futures research, they also hold normative implications concerning the kinds of futures we want, and the ones we do not.

Understanding change

Rooted in the agricultural experiments of about 100 years ago, so-called longitudinal studies investigate in a designed methodological setting how changes can be quantified over time. Rooted in the experimental approaches of the Analysis of Variance, these statistical methods became the gold standard in clinical trials, and enabled disciplines as different as ecology and psychology to systematically design studies that quantify changes over time. However, such approaches have increasingly been considered too rigid to adequately consider the diversity of parameters that characterise real-world dynamics. The Corona pandemic is an example where many dynamics can be anticipated, such as the increase in case numbers, while other dynamics are investigated in longitudinal studies, such as the effectiveness of vaccines. However, some changes can also not be instantly anticipated or investigated, which highlights that in order to explore future dynamics, longitudinal studies may enable some focal points of investigation, yet more complex dynamics may demand other approaches.
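To sketch how a longitudinal design quantifies change, consider the simplest longitudinal setting: repeated measures on the same units, before and after an intervention. The data below are invented purely for illustration, and the paired t statistic is computed by hand with Python's standard library.

```python
from math import sqrt
from statistics import mean, stdev

# Hypothetical repeated measures: the same 8 study plots, measured
# before and after a treatment (invented numbers for illustration).
before = [4.1, 3.8, 5.0, 4.4, 3.9, 4.7, 4.2, 4.5]
after  = [4.9, 4.2, 5.6, 4.9, 4.1, 5.3, 4.8, 5.0]

# A paired design analyses the within-unit differences, which removes
# the stable between-unit variation from the comparison.
diffs = [a - b for a, b in zip(after, before)]
t_stat = mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))
print(f"mean change: {mean(diffs):.3f}, paired t = {t_stat:.2f}")
```

In practice one would compare the t statistic against a t distribution with n-1 degrees of freedom (e.g. via scipy.stats.ttest_rel) to obtain a p-value; the sketch stops at the statistic to show only the temporal logic of pairing the same units across time.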

Longitudinal studies may deserve their own Wiki entry in the long run, yet what is most relevant here is that designing such studies demands a clear recognition of the time intervals and anticipated hypotheses that are being investigated. While more and more temporal data becomes available, most prominently through the Internet, and this data can be analysed by diverse methodological approaches, it is a general rule of thumb that temporal analyses of quantitative data demand a high level of expertise. This can be related to four main challenges within research that investigates more than one current snapshot in time:

Types of temporal changes

A general awareness of the different ways in which changes can manifest over time is a precondition for selecting a specific method. To this end, we can distinguish between three general patterns of change over time: linear changes, non-linear changes, and repeated patterns. Many patterns that can be derived from changes follow linear dynamics. A lack of such linear dynamics or changes may be rooted in a low data density, yet human perception also often follows non-linear dynamics. We are, for instance, able to perceive a light to be on or off, yet the fine change between these two states is too rapid to be visually perceived by humans. This is different with fire, where a smouldering fire is not really burning, but it is also not cold. A pile of wood will start to burn, the fire will eventually spread, and then slowly fade away. Thus, an electric light being switched on is a non-linear pattern, while a fire burning is more linear. Examples of sudden changes are prominent in physics, yet they also found their way into theories in economics and ecology. Within psychology or the social sciences this is even more complex because of the peculiarity of human perception, much of which is not linear; instead, we think in groups or levels. This may be the reason why it is so difficult for us to actually establish a gradual line of perception. Take the example of being happy or being sad. We can often say when we are either happy or sad, but what about some in-between state? Can you define when you are half happy and half sad? Hence the 'glass-half-empty-or-half-full' allegory. Such normative groups tend to force us to think in categories, often in dualities, which is deeply embedded in Western culture, yet also beyond it. This is the point where we should acknowledge the value of a critical perspective at least as much as the value of qualitative methods when it comes to perceptions.
This points to one of the direct tenets of critical realism, and the associated shrinking importance of personal identity, but not of personal perception. Researchers should want to understand more about subjective perspectives, try to integrate them, and realise especially how these may change over time. Exciting research is emerging to this end, and new methods allow us to unravel human perception from a qualitative perspective, thereby allowing for a more critical viewpoint on many of the categories through which we tend to see the world. However, this does not only depend on the constructs into which we tend to group the world from a quantitative or qualitative categorical viewpoint, but also on the temporal grain that we establish in our perspective.

Temporal grain and measures of time

Temporal grain can be defined as the temporal resolution at which we observe a certain phenomenon. Take again the example of the electrical lightbulb. If we switch it on, we see that it is instantly on. This can be - sometimes - different with a neon tube, which on some occasions needs a few seconds until it is filled with plasma and becomes bright. Here, there is a delay between the off state and the on state - an in-between state. If I now close my eyes and only open them at a regular interval every five seconds, then this in-between state might be missed by me. Such temporal grain effects are crucial in our view of the world, since we can only observe what is within the frequency of our observations. Within quantitative analysis, this is also closely related to the question of statistical power, since each time step needs to be sufficiently supported by data. If you conduct repeated interviews with the same people, one missing interview may make the whole data point useless, depending on the analysis method that is being used. Temporal grain is equally important in qualitative approaches, where repeated observations or gathering of information may hold a key to human perceptions and their change. If you want to inquire about the mood of a participant in a study, then it is clear that you do not want to ask the person every minute how they feel, yet you equally do not want to ask the person every month, because more often than not, moods shift quickly. Another good example of the relevance of temporal grain in qualitative approaches is the analysis of the paintings of an artist. When we look at Picasso, there are clear periods that are widely accepted today, and Georgia O'Keeffe is yet another example where periods are well known.
While the analysis of an artist's work typically proceeds closely along their biography, a special role falls to the critique of art in this context, which - in a nutshell - demands not only recognising but also transcending the context, including the social and societal circumstances surrounding the origin and influences of the art. Since this makes critique a methodological approach well beyond the mere reflection on the art itself, it creates new knowledge, and is often a highly relevant analysis of societal developments. This impact of temporal grain is often overlooked or not contextualised in the recognition of scientific methods, and its importance will surely grow with a more diverse and less disciplinary science.
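The neon-tube example can be turned into a small simulation. The state function below is hypothetical (a three-second warm-up between seconds 11 and 14 is simply assumed), and it shows how an observation interval coarser than the phenomenon never records the in-between state at all.

```python
def neon_state(t):
    """Hypothetical neon tube: off, then a brief 3-second warm-up, then on."""
    if t < 11:
        return "off"
    if t < 14:
        return "warming"
    return "on"

# Observe the tube over 30 seconds at two different temporal grains.
coarse = {neon_state(t) for t in range(0, 30, 5)}  # look every 5 seconds
fine   = {neon_state(t) for t in range(0, 30, 1)}  # look every second

print("grain of 5 s observes:", sorted(coarse))  # misses the warm-up entirely
print("grain of 1 s observes:", sorted(fine))    # catches all three states
```

The same logic underlies the interview and mood examples above: whatever changes faster than the chosen observation interval is invisible to the study, no matter how carefully each single observation is made.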

Temporal relativity

The aspect of temporal perception is closely linked to the fact that time is normative. Since we perceive time differently, it is not objective. Consequently, everything that can be associated with time is equally not objective. We should not confuse this with the mere fact that some measures of time are more precise than others. The invention of the marine chronometer as a means of locating a ship in the vastness of the oceans allowed not only for more precise navigation, but also for a more efficient abuse of the regions known back then as colonies. Hence, even the mere consequences of time, or its more precise utilisation, are relative and normative. Yet what about time itself, as a physical entity? Since Einstein it has been clear that under a clever experimental setting, twins may end up with different ages, depending on their relative travels through the Universe. That does not mean that everything is relative, but it can be. The discovery of gravitational waves is another example of a physical postulate that was proven by a precise measurement of time. Within such experimental settings, time is of the essence. If the 20th century was the period of utilitarianism and its failures, then this regime more often than not ran on a strict time regime. Just as the results of science rooted in positivism were increasingly criticised and questioned, there is an equally strong urge to question scientific results once time has passed. While it is unclear how much time leads to sufficiently different circumstances to make results questionable, the same process can even be - counterintuitively - reversed. Many of the effects of climate change have been more severe in reality than the models predicted years or decades before. Climate change has, sadly, become a classic example where many models were actually too conservative compared to reality.
It is a weird plot twist that this fact has become an argument of climate change deniers, which showcases that for some, the ends justify all means, even our own end. This links to the ultimate end, or at least how it is potentially proclaimed. Since Nostradamus and probably long before, the end of humankind as predicted by prophets and later scientists has been used as a threat to make people submit, or at least leave them in awe, if not fear. The end of time (or at least of humankind) has been a continuous motivation for all sorts of actions, and uncountable predicted apocalypses failed to materialise once the due date went by. Once more, time proves its normative capacity, at least up to some point. How strongly people feared these ends showcases the normative strength time can encapsulate, and more methodological dimensions of time may unlock in the future.

Methodological viewpoints of time

Instead of gazing at the horizon, let us take a step back and look at the main points we should all agree upon when it comes to time and the nexus of scientific methods.

Within scientific methods, documentation is key. Scientific studies should either produce knowledge that is reproducible, or the documentation of the knowledge production process should enable a seamless understanding of the overall process that led to the scientific result. The methodological design is often deeply rooted within the norms of scientific disciplines, making it a highly normative process. Concerning time, this is relevant insofar as researchers need to clearly document when, for instance, data gathering and analysis were conducted. The Internet is a good example of an often unstable data source, hence the date when you accessed specific information may be the key to the results you obtained, often much later, from your analysis. On the other hand, the date of your analysis also reveals whether you took the latest information into account. While next to no one does that - myself included - the Corona pandemic is a prominent example of how rapidly knowledge emerges and even changes. To this end, citations are like indirect time stamps, since they allow the reader to understand which knowledge went into your conclusions. This helps very little, however, if the knowledge you used is primary data, for example a database with numbers of infected people. Yet just as you would clearly document the date on which you conducted specific interviews, it is advisable to have a meta-file that contains information about the time stamps of your research. The natural sciences often have lab books for this, and ethnographic research has equally clear examples of how such emerging data and knowledge can be documented. Research diaries should thus become more important in the coming years, because they allow for a critical examination of the temporal dimensions of our own research processes and conduct in retrospect.
These existing accounts will allow deeper insights into research processes and mechanisms that many people assume to be rational, yet as Larry Laudan rightly concluded, research is often not exactly rational. Instead we talk about perceptions, interactions and often also serendipity, making the documentation of our research processes a key endeavour where more possibilities, norms and standards will continue to emerge.
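A research meta-file of the kind suggested above can be as simple as a machine-readable log of time-stamped steps. The following sketch shows one possible minimal format; the file name, fields and entries are all invented for illustration, not a prescribed standard.

```python
import datetime
import json

log = []

def record(step, note):
    """Append a time-stamped entry describing one step of the research process."""
    log.append({
        "step": step,
        "note": note,
        "timestamp": datetime.datetime.now().isoformat(timespec="seconds"),
    })

# Two hypothetical entries: when the data was accessed, when it was analysed.
record("data access", "downloaded infection counts from a public database")
record("analysis", "fitted weekly trend model, version 1")

# Persist the log alongside the data so the temporal context survives.
with open("research_log.json", "w") as f:
    json.dump(log, f, indent=2)
```

Even such a bare-bones log answers the two questions raised above - when the data was gathered and when it was analysed - and can later be read back for a retrospective examination of the research process, much like a digital lab book.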

Another key aspect where time patterns research conduct is workload. It is almost a cliché how much people underestimate the time that needs to be put into research to arrive at the respective results, and again this is often a deeply personal process. How many people finish their thesis in time? Me, certainly not. Making a time plan and an overview of milestones for your research is surely worth your while, yet you should not feel bad if this changes later on - it almost always does. Planning your research is the ultimate joke that time plays on us folks, and learning to become realistic to this end is something one may strive for a whole life. So why should it be different when it comes to the perception of time?

Temporal aspects of the combination of methods

A crucial aspect that deserves more reflection when it comes to scientific methods and time is the sequence in which research is conducted. While this was already mentioned before, a more detailed account is necessary in order to present general principles of the temporal sequence in which methods are conducted. The concept of triangulation is one example, where, for instance, quantitative and qualitative methods are combined and the respective results are compared. Ideally, this could also be a process where the sequence of methods being used enables a higher specificity in order to answer the research question. An example would be to conduct a broad survey within a community, become more focussed with semi-structured interviews, and then gain really deep knowledge through ethnographic observations of some key actors in the community. Reversing this order would make hardly any sense. Still, in medicine it is often single cases from clinical practice that inspire larger complex investigations through clinical trials; thus while in the first example the specificity increases, in the second example it is actually reversed. A key challenge when increasing specificity is to maintain validity, which is often scattered in the beginning, but becomes more focused at a later stage.

Within larger research projects, different methodological approaches are often used in parallel instead of in a sequential order. While one branch of a project may conduct workshops, another branch may focus on ethnographic observation, and yet another part revolves around an experiment with an intervention. More often than not, there is little anticipation of how these different forms of knowledge are to be integrated, as the united goal is not always straightforward. To this end, integration is a frontier that is not always in focus, and finding a consensus will remain a challenge within most interdisciplinary fields of research beyond medicine. Interestingly, in medicine such temporal sequences or parallel conducts are more established and follow clearer guidelines and also practicalities within clinical practice. The united goal of helping the patient may serve a strong integrative purpose to this end, while the overall goal is less clear in other research, let alone practice.

Another often overlooked aspect regarding time in the combination of methods is long-term observation. Since research funding hardly ever exceeds a few years, long-term research is rare. Especially when maintaining longer experiments or investigating a larger dataset, e.g. within the arts, it is often difficult to generate knowledge that can deviate from knowledge synthesised within a rather short period. More focus on long-term research is needed in order to gain the knowledge necessary for facing the problems we need to overcome. Climate change is a rare example where institutions such as the IPCC make sure that enough momentum and coherence is generated to create a body of knowledge that integrates millions of working hours of researchers, looks at incredibly large and diverse datasets, and creates an integration and synthesis of these often long-term datasets that allows for clear policy recommendations.

The future of temporal aspects in methods

The problems we still face regarding temporal aspects of research may all be overcome in future research. The last decades have already shown how dense the available data has become in many branches of science, and how digital communication and means of documentation led to a new era of research when it comes to temporal aspects of scientific methods. Analytical methods allow for longitudinal analyses unheard of some decades ago, and machine learning unravels altogether new ways of research. Online libraries of art allow broader access to data, yet all these exciting technologies must not lead us to ignore the key challenge: long-term research and funding are rare to date, and the role of academia within society will need to change in order to allow for research that is more continuously embedded. This would allow us to learn from past mistakes, and to utilise more diverse approaches to gather knowledge about different futures, and how we may reach them.

The author of this entry is Henrik von Wehrden.