National Strategic Computing Initiative Essay


On July 29, 2015, U.S. President Barack Obama launched the National Strategic Computing Initiative. The initiative aims to accelerate the development of an "exascale" computing system: a supercomputer that can perform a billion billion (10^18) operations per second. Such a machine would allow the government to run detailed models of some of the world's most difficult problems, simulating solutions in ways that were not possible before. It would be extremely valuable for analyzing massive scientific data sets and could help address pressing public policy issues.
Nowadays, more data are available than ever before. Anyone with a computer can find large data sets, run a few regressions, and claim that the resulting decisions are robust because they are based on big data. But although more data are usually beneficial, a large data set is not by itself sufficient for creating good public policy. There is a sense that numbers drawn from big data cannot lie, yet data, like any other source, can have flaws, such as missing or badly recorded values.

Consider the K-12 education system, where data on millions of students are tracked over many years. It should be easy to run a regression on these data, observe how the education system is performing, and craft policy from there, right? This "perfect situation" is very far from reality. First, many public schools do not have the funds to hire people to collect data; often it is the school secretary, or even the classroom teachers, who must enter data on their students, which leads to missing or incorrectly entered records. Another glaring problem is that different states, districts, and schools collect different data. For example, one school I worked with while writing my thesis did not even track the gender of its students until I suggested it. If these problems are ignored while creating national education policy, the results could be disastrous.
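The danger described above can be made concrete with a small simulation. The sketch below is purely illustrative, with made-up numbers and a hypothetical scenario (it is not drawn from any real school's records): if student records are missing not at random, say only students who passed ever get entered into the system, then an ordinary regression on the surviving records understates the true relationship between study time and scores.

```python
import random

random.seed(0)

# Simulate a "true" relationship: score = 50 + 5 * hours_studied + noise.
population = []
for _ in range(10_000):
    hours = random.uniform(0, 10)
    score = 50 + 5 * hours + random.gauss(0, 5)
    population.append((hours, score))

def ols_slope(pairs):
    """Ordinary least-squares slope of score on hours."""
    n = len(pairs)
    mean_h = sum(h for h, _ in pairs) / n
    mean_s = sum(s for _, s in pairs) / n
    cov = sum((h - mean_h) * (s - mean_s) for h, s in pairs)
    var = sum((h - mean_h) ** 2 for h, _ in pairs)
    return cov / var

# With complete records, the regression recovers a slope near the true 5.
full_slope = ols_slope(population)

# Now suppose overworked staff only enter records for students who passed
# (score > 70). The data are "missing not at random": low scorers vanish.
reported = [(h, s) for h, s in population if s > 70]
truncated_slope = ols_slope(reported)

print(f"full data: {full_slope:.2f}, passing-only data: {truncated_slope:.2f}")
```

The regression on the incomplete records is biased toward zero, so a policymaker using only the reported data would underestimate how much studying matters. The same mechanism, records that disappear for reasons correlated with the outcome, is exactly what under-resourced data collection produces.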
