1. The fundamental problem with computational science is a lack of educated personnel - they don't solve - this is their own solvency evidence
Benioff et al 5
(Marc R. Benioff and Edward D. Lazowska, PITAC Co-Chairs, President's Information Technology Advisory Committee, http://www.nitrd.gov/pitac/reports/20050609_computational/computational.pdf) chip
In addition, our preoccupation with peak performance and computing hardware, vital though they are, masks the deeply troubling reality that the most serious technical problems in computational science lie in software, usability, and trained personnel. Heroic efforts are regularly devoted to extending legacy application codes on the latest platforms using primitive software tools and programming models. Meanwhile, the fundamental R&D necessary to create balanced hardware-software systems that are easy to use, facilitate application expression in high-level models, and deliver large fractions of their peak performance on computational science applications is perennially postponed for a more opportune time. More ominously, these difficulties are substantial intellectual hurdles that limit broad education and training.

The PITAC's Call to Action

The PITAC believes that current education and research structures and priorities must change radically if the United States is to sustain its world preeminence in science, engineering, and economic innovation. We are not alone. For two decades, organizations in government, academia, and industry have been issuing reports recommending sustained, long-term investment to realize the benefits of computational science. As Sidebar 2 notes, these calls have had only a limited impact. Instead, short-term investment and limited strategic planning have led to excessive focus on incremental research rather than on long-term, sustained research with lasting impact. Furthermore, silo mentalities have restricted the flow of ideas and solutions from one domain to another, resulting in duplication of effort and little interoperability.
2. Computational transportation systems fail - data overload
Winter et al 10
(Stephan Winter: The University of Melbourne, Australia, Monika Sester: Leibniz University Hannover, Germany, Ouri Wolfson: University of Illinois, Chicago, USA, Glenn Geers: National ICT Australia, Sydney, Australia, ACM SIGMOD Record, Volume 39 Issue 3, September 2010, Pages 27-32, http://www.cs.uic.edu/~boxu/mp2p/39-135-1-SM.pdf) chip
In large cities and on congested roads the data density will be vast. For individual travelers, and the devices and systems that are assisting them in making a journey, only a small fraction of the received data will be relevant (and even less will be useful). Since there is no guarantee that the data available to a traveler are of useable quality or even available when needed, filling the spatial and temporal data gap is a challenging issue. Is it meaningful to fill the gaps with data from yesterday or even a minute ago? Can statistical machine learning techniques such as support vector regression help? The answers are not clear and must depend on what the data is to be used for. After all, a bus timetable is simply a prediction of often dubious reliability.

4. Visualization of the huge, multi-dimensional data sets generated will not be easy. Many users will have their own requirements and will want to construct queries and visualize the results. It is unlikely that the mobile device of an individual user will have the computational power or storage for such a task. Will cloud computing come to the rescue? Will peer-to-peer systems help with data storage and download? The physical presentation of the data is also an issue. An in-vehicle display must not be too obtrusive or interfere with the driver's ability to control the vehicle (at least until the vehicle is fully autonomous). Questions of relevance, urgency, and safety need to be addressed.
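To make the card's data-gap question concrete, here is a minimal sketch of the support vector regression technique the authors float, applied to a gap in a stream of traffic-speed observations. Everything here is an illustrative assumption, not from the source: the probe-vehicle speeds are synthetic, and the gap window and SVR parameters are arbitrary.

```python
# A minimal sketch, assuming fabricated probe-vehicle data: use support
# vector regression (the open question raised in the card) to interpolate
# a spatio-temporal gap in traffic-speed observations.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(42)

# Timestamps (minutes) with a reporting gap between t=30 and t=45.
t_observed = np.concatenate([np.arange(0, 30), np.arange(45, 60)])

# Hypothetical speeds (km/h): a congestion dip centered near t=35,
# plus sensor noise.
speeds = (
    55
    - 20 * np.exp(-((t_observed - 35) ** 2) / 200)
    + rng.normal(0, 1.5, t_observed.size)
)

# Fit SVR on the observed minutes, then predict across the gap.
model = SVR(kernel="rbf", C=10.0, epsilon=0.5)
model.fit(t_observed.reshape(-1, 1), speeds)

t_gap = np.arange(30, 45).reshape(-1, 1)
filled = model.predict(t_gap)
print("Estimated speeds in the data gap:", np.round(filled, 1))
```

Note that this sketch illustrates the card's caveat rather than refuting it: the regression can only extrapolate the pattern it saw, so whether gap-filling from minutes-old data is meaningful still "must depend on what the data is to be used for."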
3. Computer models fail to solve warming - IPCC scientists corruptly ignore real data to receive grant money
Young 09
George Young (a neuroscientist and physicist, a doctoral graduate of the University of Oxford, England, who previously completed postgraduate work at King's College, University of Aberdeen, Scotland, and has taught graduate-level Statistical Analysis and Mathematical Modeling. He currently chairs a privately funded think-tank engaged in experimental biophysics) 5/31/09
(“It's the Climate Warming Models, Stupid!”, http://www.americanthinker.com/2009/03/its_the_climate_warming_models.html) chip
In addition to the difficulties mentioned above is the late-arriving Anthropogenic (man-made) Global Warming (AGW) prejudice that has set the evolution of climate modeling back a few decades. Previously known and accepted climate components have been summarily stripped from the equation -- such as the dominant factors involving the Sun and the importance of water vapor in the atmosphere as the dominant greenhouse gas. This is because, in the push to acquire lucrative AGW-biased government grants, many scientists have opted to blatantly skew their climate models to amplify AGW-favoring evidence and exaggerate the importance of anthropogenic CO2. In this manner, they then qualify to receive funding and ensure publication.

Describing the compounded inaccuracies of these Johnny-come-lately modelers who would rather be funded than scientifically astute, Dr. Tim Ball, a former climate scientist at the University of Winnipeg, sardonically clarifies: "The analogy that I use is that my car is not running that well, so I'm going to ignore the engine (which is the sun) and I'm going to ignore the transmission (which is the water vapor) and I'm going to look at one nut on the right rear wheel (which is the Human produced CO2) ... the science is that bad!"

Dr. Ball's analogy has never proved clearer than when examining the climate models used by the UN's Intergovernmental Panel on Climate Change (IPCC). As just noted, the inaccuracy of those models cherry-picked by the IPCC revealed that the largest and most robust variables of climate change and their antecedents were intentionally dismissed and dropped from inclusion in their investigations, including the variables of solar activity, water vapor and cloud formation in the atmosphere, major ocean currents, as well as other vital components.

If you're thinking that without due consideration of the known and most weighty variables in the climate system, the forecastable conclusions should prove to be fallacious and wrong, you would be right. Yet that hasn't stopped the UN's IPCC from driving the propaganda of AGW, emphasizing the wrong deductions while deliberately disregarding the bigger picture altogether.

Ironically, model worthiness and accuracy can be quickly assessed by simply plugging in yesterday's numbers and seeing if the model actually yields results that are aligned with the known history. Yet to date, climate models have failed miserably. Though there is hope for further improvement, there is no current climate model that can, when applied to the documented past, accurately re-forecast the known historical record, much less portend what could be happening to the weather next week, let alone the next century. Climate modeling has yet to rise to a level of sophistication that allows us to accurately predict the future.

Knowing the primitive state of climate modeling, it is at least irresponsible, if not maleficent, to use such flawed methods to intentionally affect global public policy-making. It is morally reprehensible, if not criminal, to promote the panicking of dire climate consequences and extinction scenarios authored by climate models known to be verifiably defective. This tyranny of appearance has yet to be toppled.
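The hindcast test Young describes (feed the model yesterday's numbers and check whether it reproduces the known record) is straightforward to sketch. Below is a minimal, hypothetical illustration: the "model" is a stand-in linear trend fit, the anomaly values are fabricated placeholders, and none of it represents any actual IPCC model or dataset.

```python
# A minimal sketch of hindcast validation under stated assumptions:
# fit a stand-in model on an early period, re-forecast a held-out later
# period, and score the re-forecast against the "known" record.
import numpy as np

rng = np.random.default_rng(0)

years = np.arange(1950, 2010)
# Fabricated temperature-anomaly record (deg C): weak trend plus noise.
observed = 0.012 * (years - 1950) + rng.normal(0, 0.08, years.size)

# "Train" the stand-in model on 1950-1989, re-forecast 1990-2009.
train = years < 1990
coef = np.polyfit(years[train], observed[train], deg=1)
hindcast = np.polyval(coef, years[~train])

# Skill metric: RMSE of the re-forecast against the held-out record.
rmse = np.sqrt(np.mean((hindcast - observed[~train]) ** 2))
print(f"Hindcast RMSE over 1990-2009: {rmse:.3f} deg C")

# Compare against a trivial persistence baseline (repeat the last
# training-period value); a model that cannot beat this fails the
# card's criterion of accurately re-forecasting the known record.
persistence = np.full_like(hindcast, observed[train][-1])
rmse_base = np.sqrt(np.mean((persistence - observed[~train]) ** 2))
print(f"Persistence baseline RMSE:   {rmse_base:.3f} deg C")
```

The point of the sketch is the procedure, not the numbers: any candidate model, simple or complex, can be scored the same way against held-out history before its forward projections are trusted.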