Statistics and mixed methods


Mixed methods

Much in modern science is framed around statistics, for better or worse. This is due to the arrogance of labelling "the scientific method" based on deductive approaches, and to the fact that much of the early methodological canon was biased towards and dominated by quantitative approaches. This changed partly with the rise, or better the increase, of qualitative methods during the last decades. We should realise that the development of the methodological canon is not independent but interconnected with the societal paradigm. Hence the abundance, development and diversity of the methodological canon is in a continuous feedback loop with changes in society, but also driven by changes in society. Take the rise of experimental designs, and the growth it triggered by fostering developments in agriculture and medicine, for better or worse. Another example are the far-reaching developments triggered by critical theory, which had clear ramifications for the methodological canon and for the societal developments of that time. Despite all its ivory towers, science is not independent of the Zeitgeist and of the changes within society; science is rooted in and informed by the past, influences the present, and also builds futures. This is the great privilege society gained from science: we can now create, embed and interact with a new knowledge production that is ever evolving. Mixed methods are one step in this evolution. Kuhn spoke of scientific revolutions, which sounds appealing to many. As much as I like the underlying principle, I think that mixed methods are more of a scientific evolution that is slowly creeping in.

The scientific canon that formed during the Enlightenment was forged by industrialisation and became a cornerstone of modernity. The problems that arose out of this in modern science were slowly inching in between the two world wars, while methodology, and especially statistics, not only bloomed in full blossom but also contributed its part to the catastrophe. Science opened up to new forms of knowledge, and while statistics would often contribute to such emerging arenas as psychology and clinical trials, other methodological approaches teamed up with statistics: interviews and surveys combined new sampling designs with statistical analysis. Hence statistics teamed up with new developments, yet other approaches that were completely independent of statistics were also underway. New knowledge was unlocked, and science thrived into uncharted territory. From a systematic standpoint we can now determine at least three developments: 1) methods that were genuinely new, 2) methods that were used in a novel context, and 3) methods that were combined with other methods.

Let us embed statistics into this line of thinking. Much of the general line of thinking in statistics was developed in the last phase of modernity, i.e. before the two world wars. More advanced statistics were of course developed later and are still being developed, but some of the large breakthroughs in the general line of thinking came rather early. In other words, much of what was most abundantly applied up until the computer age was already developed early in the 20th century. Methods in statistics were still developed after that, but they were often so complicated that their application only increased once computers became widely available.
What was quite relevant for statistics, however, was its emergence in diverse disciplines. Many scientific fields implemented statistics to a point where it dominated much of the discourse (ecology, psychology, economics), and often this also led to specific applications of statistics and even genuinely new approaches down the road. Statistics also firmly extended its place on the landscape of methods through the combination with other methods. Structured interviews and surveys are a standard example where one approach serves to gather the data while the actual analysis is conducted with statistics. Hence new revolutionary methods often directly implemented statistics into their utilisation, making the footing of statistics even firmer.
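To make this hand-over from data gathering to statistical analysis concrete, here is a minimal Python sketch. The survey counts, group labels and answer categories are invented purely for illustration; the example only shows how tabulated survey answers can be passed to a standard chi-square test of independence.

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical survey results: counts of answers to one statement,
# split by two invented respondent groups. All numbers are made up
# to illustrate the hand-over from survey to statistics.
table = pd.DataFrame(
    {"agree": [45, 30], "neutral": [25, 35], "disagree": [30, 35]},
    index=["urban", "rural"],
)

# A chi-square test of independence asks whether answer frequencies
# differ between the two groups more than chance alone would suggest.
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}, degrees of freedom = {dof}")
```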

Triangulation of statistics within methodological ontologies

First of all, I have to start with a disclaimer. Triangulation is often interpreted in a purely quantitative sense, which falls short. I use it here in the sense of a combination of different methods to triangulate knowledge from diverse sources. Ideally, triangulation allows us to create more knowledge than single method approaches do. More importantly, triangulation should allow us to use methods in harmony, meaning that the whole of the triangulated methods is more than the sum of its parts. To this end, triangulation can be applied in many contexts, and may bring more precision to situations where the label "mixed methods" is not enough to describe the methodological design and to enable a clear communication of the research design or approach. Not least because of the dominating position that statistics held in modern science, and because the lack of other forms of knowledge became increasingly apparent, qualitative methods were increasingly developed and utilised after the Second World War. I keep this dating vague, not only because of my firm belief that much of this was a rather continuous development, but mostly because of the end of modernity. With the Enlightenment finally ending, we entered a new age in which science became more open, and methods became more interchangeable between different branches of science.


Quantitative vs qualitative

The opening of a world of quantitative knowledge started a discourse between quantitative and qualitative research that has effectively never ended. Pride, self-esteem and ignorance are words that come to my mind when I try to characterise what I still observe today in the exchange between these two lines of thought. Only when we establish trust and appreciation can we ultimately bring these two complementary types of knowledge together. From the standpoint of someone educated in statistics, I can only say that it is my hope that words such as "significant", "correlated" or "clustered" are not used as mere words, but as the analysis approaches and associated concepts that these words stand for. What is even more difficult for me is when these concepts are rejected altogether as outdated, dogmatic, or plain wrong. Only if we join forces between these two schools of thought may we solve the challenges we face. Likewise, it is not ok that many people educated in statistics plainly reject qualitative knowledge, and remain dogmatic in their slumber. I consider the great gap between these two worlds one of the biggest obstacles for development in science, creating severe ripples into many arenas. People educated in statistics should be equally patient in explaining their approaches and results, and equally receptive and open minded about other forms of knowledge. We need to walk the extra mile to bridge this gap. If we cannot bridge it, we need to walk around it: connect through concrete action, and create joint knowledge, which will become joint learning. Explain statistics to others, but in a modest and open way.
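As a small illustration of what it means to use these words as concepts rather than as rhetoric, the following Python sketch computes a Pearson correlation and its p-value on invented data. The variables and numbers are assumptions for illustration only; the point is that "correlated" refers to the coefficient and "significant" to the result of the associated test.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(seed=42)

# Two invented variables with a built-in linear relation plus noise.
x = rng.normal(size=100)
y = 0.5 * x + rng.normal(scale=1.0, size=100)

# "Correlated" refers to the correlation coefficient (here Pearson's r),
# "significant" to the p-value of the corresponding test, not to a
# vague sense of importance.
r, p = pearsonr(x, y)
print(f"Pearson's r = {r:.2f}, p-value = {p:.4f}")
```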

Inductive vs deductive

A fairly similar situation can be diagnosed for the difference between inductive and deductive approaches. Many people building on theory are often on a high horse, and equally, many inductive researchers claim to be the only ones to approach science in the correct manner. I think both sides are right and wrong at the same time. What can be clearly said by anybody versatile in statistics is that both sides are, at times, lying. The age of big data crashed into many disciplines that were once theory driven. With their development limping behind the modern era, these disciplines often pretend to build hypotheses merely because their community and scientific journals demand a line of thinking that is built on hypothesis testing. Since these disciplines have long had access to large datasets that are analysed in an inductive fashion, people pretend to write hypotheses when they in fact formulated them after the whole analysis was finalised. While this may sound horrific to many, scientists were no more than frogs in a slowly heating pot. The system slowly changed and adapted until no one really realised that we were on the wrong track. There are antidotes, such as the preregistration of studies in medicine and psychology, yet we are far away from solving this problem. Equally, many would argue that researchers claim to be inductive in their approach when they are in fact not only biased all over the place, but also widely informed by previous studies and theory. Many would claim that this is mainly a weak point of qualitative methods, but I would disagree. With the wealth of data that became available through the internet over the last decades, we also have much at our disposal in statistics, and many claim to be completely open minded about their analysis when in fact they also suffer from many types of biases, and are equally stuck in a dogmatic slumber when they claim to be free and unbiased. To this end, statistics may rise to a level where clearer documentation and transparency enable a higher level of science that is also aware of the fact that knowledge changes. This is a normal process in science, and does not automatically make previous results wrong. Instead, these results, even if they change, are part of the picture. This is why it is so important to write an outline of your research, preregister studies if possible, and have an ethical check conducted if necessary. We compiled the application form for ethics from Leuphana University at the end of this Wiki.

Spatial and temporal scales

Statistics can be utilised across basically all spatial and temporal scales. Global economic dynamics are correlated, experiments are conducted on individual plants, and surveys are conducted within systems. This showcases why statistics became established across all spatial scales, but it also instantly highlights once more that statistics can only offer part of the picture. More complex statistical analyses such as structural equation models and network analysis are currently emerging, allowing for a more holistic system perspective as part of a statistical analysis. With the rise of big data, more and more data becomes available, allowing us to make connections between different data sources, and hence to bridge different forms of knowledge, but also different spatial scales. While many of these statistical tools were established decades ago, we only slowly start to compile datasets that allow for such analyses. Likewise, with the increasing availability of data, an increasing diversity of temporal dimensions is emerging, and approaches such as panel statistics and mixed effect models allow for an ever evolving understanding of change. Past data is equally being explored as attempts are made to predict the future. We need to be aware that statistics only offers a predictive picture here, and is unable to account for future changes that are not implemented in the analysis, yet may well occur.

To this end, statistics needs to be aware of its own limitations, and needs to be critical of the knowledge it produces in order to contribute valid knowledge. Statistical results have a certain confidence, results can be significant, even parsimonious, yet may only offer part of the picture. Unexplained variance may be an important piece of the puzzle in a mixed method approach, as it helps us to understand how much we do not understand. Statistics may be taken more seriously if it clearly demarcates the limitations it has or reveals. We need to be careful to not only reveal the patterns we may understand through statistics, but also how these patterns are limited in terms of a holistic understanding.

However, there is also reason for optimism when it comes to statistics. Some decades ago, statistics was exclusive to the small fraction of scientists who could afford a then very expensive computer, or calculations were made by hand. Today, more and more people have access to computers, and with them to powerful software that allows for statistical analysis. The rise of computers led to an incredible amount of statistical knowledge being produced, and the internet enabled the spread of this knowledge across the globe. We may be living in a world of exponential growth, and while many dangers are connected to this recognition, the rise in knowledge cannot be all bad. More statistical knowledge bears hope for less ignorance, but this also demands the responsibility of communicating results clearly and precisely, highlighting gaps and limitations in our knowledge.
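As a minimal sketch of the point about confidence and unexplained variance, the following Python example fits a simple regression on invented data with statsmodels and reports confidence intervals as well as the share of variance the model does not explain. The variable names and numbers are assumptions for illustration only.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(seed=1)

# Invented data: a response that depends only partly on the predictor.
df = pd.DataFrame({"rainfall": rng.normal(500, 100, size=200)})
df["yield_t"] = 2.0 + 0.004 * df["rainfall"] + rng.normal(scale=0.5, size=200)

# Ordinary least squares regression of yield on rainfall.
model = smf.ols("yield_t ~ rainfall", data=df).fit()

# Confidence intervals express the certainty of the estimates, while
# 1 - R^2 quantifies the variance the model leaves unexplained: the
# part of the picture that statistics alone does not deliver.
print(model.conf_int())
print(f"explained variance (R^2): {model.rsquared:.2f}")
print(f"unexplained variance:     {1 - model.rsquared:.2f}")
```

In a mixed method design, the unexplained share reported here can be read as an invitation to other methods rather than as a mere nuisance term.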

Possibilities of interlinkages

We only start to understand how we can combine statistical methods with other forms of knowledge that are applied in parallel. The sequential combination of other methods with statistics has long been known, as the example of interviews and their statistical analysis has already shown. However, the parallel integration of knowledge gained through statistics and knowledge from other methods is still in its infancy. Scenario planning is a prominent example that can integrate diverse forms of knowledge, and other approaches are slowly being investigated. However, the integrative capability needed by a team to combine such different forms of knowledge is still rare, and research, or rather scientific publications, are only slowly starting to combine diverse methods in parallel. Hence many papers that claim a mixed method approach often actually mean either a sequential approach or different methods that remain unconnected. While this is perfectly valid and a step forward, science will have to go a long way to combine parallel methods and their knowledge more deeply. Funding schemes in science are still widely disciplinary, and this dogma often dictates a canon of methods that may have proven its value, some may argue. However, these approaches did not substantially create the normative knowledge that is needed to contribute towards a sustainable future.

Statistics and disciplines 800 words

Being a statistician in a mixed method world

Becoming experienced at one method takes about as long as learning a musical instrument.

Bokeh an approach to mixed methods 500 words