The military funding of science has had a powerful transformative effect on the practice and products of scientific research since the advent of modern warfare. Advanced science-based technologies have been viewed as essential elements of a successful military. World War I is often called "the chemists' war", both for the extensive use of poison gas and for the importance of nitrates and advanced high explosives.

Physicists also contributed by developing wireless communication technologies and sound-based methods of detecting U-boats, forging the first tenuous long-term connections between academic science and the military.

World War II saw extensive work on radar, which proved highly influential in the course of the war: it enabled the detection of enemy ships and aircraft and underpinned the radar-based proximity fuse. A more recent development is the Afghan model of warfare, which substitutes indigenous allies for American conventional ground troops by exploiting U.S. airpower and small numbers of American special operations forces.

Some argue that this model is widely applicable, enabling a major restructuring of the U.S. military and considerable freedom for American military intervention. An assessment of such claims in light of recent combat experience in Afghanistan and Iraq, however, finds the model’s applicability to be more limited. Where U.S. allies have had skills and motivation comparable to their enemies’, the Afghan model has proven extremely lethal even without U.S. conventional ground forces.


But where U.S. allies have lacked these skills, they have proven unable to exploit the potential of American airpower. The model can thus be a powerful tool, but one with important preconditions for its use, and these preconditions limit its potential to transform U.S. force structure or defense policy.

The impact of the longer land wars on civilian mortality during this period was often extreme. The reasons for this had little to do with the fighting itself: wartime civilian mortality crises were precipitated by fatal epidemic diseases and starvation. Modern demographic historians attribute the starvation to military supply systems that stripped civilians of food and the means to acquire it, and the epidemics to decreased resistance to disease caused by undernutrition and to increased rates of disease transmission brought about by troop movements and civilian refugee flows.

There are two problems with these explanations, one substantive, one methodological. Substantively, they provide little scope for explaining variations in the demographic impact of early modern warfare; in particular, they cannot easily explain the striking anomaly of the English civil wars, during which the population increased rather than decreased.

Mathematical cryptography, meteorology, and rocket science were also central to the war effort. The technologies employed included jet aircraft, radar, proximity fuses, and the atomic bomb; military leaders came to view continued advances in technology as the critical element for success in future wars. Whole new fields, such as digital computing, were born of military patronage. The latest research is being conducted in the field of quantum electronics.


Countries with greater technological advancement are now regarded as world leaders; in the present scenario, technology is decisive. It is therefore evident that nations are using technology both to develop modern weapons of war and to improve people's lives.

Science is opening a whole new frontier in modern warfare, but it also threatens to wipe out much of the human population. These advances should be used to help mankind and improve life for millions of people. The question, in the end, is whether we have learned from the past, or whether the future looks bleak for humanity, since another world war would lead only to widespread destruction.