Choosing the Right Axis

An institutional history of the Belgrade Eta forecast model

Authors

  • Vladimir Janković, University of Manchester

Abstract

Since the 1950s it has become customary to regard the dramatic development of computing technology as a necessary condition of weather prediction; the use of high-speed, state-of-the-art computers has repeatedly been described as the foundation for solving the complex mathematics of the even more complex physics of the atmosphere. Not only has the supercomputer shortened forecast times, it has also shaped theory by putting numerical meteorology center stage. Thus Jule Charney wrote of the “great psychological stimulus that the very possibility of high-speed computation brought to meteorology”, and Akira Kasahara claimed that “when it comes to the bottom line, whoever has the faster computer will win”. Kasahara was right: the people and institutions that developed and applied numerical weather forecasting have always enjoyed access to the best machines and a steady income. Charney’s group worked on von Neumann’s Meteorology Project on the ENIAC and the IBM machines at Princeton and Maryland in the early 1950s; Thompson and Platzman taught courses in numerical prediction at MIT and Chicago in 1953; and the principal research groups included the Air Force Cambridge Laboratory, the British Meteorological Office, and the International Meteorological Institute at the University of Stockholm. In Russia, Obukhov, Buleyev, and Yudin worked on STRELA computers and kept pace with the groups on the other side of the Iron Curtain.

Published

2004-12-01
