For 24 hours in early November, we held the 2nd GA Tech Code for Good Student Hackathon. Continuing last year's event, we retained the theme of teaching healthy lifestyle choices to combat childhood obesity. From edutainment to exercise games, we sought to create worthwhile projects that can help an at-risk demographic: our future.
Recent innovations in data warehousing and business analytics dramatically increase the capability and potential value of today’s massive, diverse, and often fast-moving data flows. Companies now perform interactive queries and predictive analytics using all available data, including operational data and the huge amounts of poly-structured data available from logs, social networks, sensors, and many other sources. In this white paper, we define a practical, cost-effective infrastructure for supporting data-driven decision-making on an enterprise scale.
The business potential of big data analysis is enormous across virtually every business sector. The Intel IT organization has implemented use cases delivering hundreds of millions of dollars in business value. This paper discusses a few of those use cases and the technologies and strategies that make them possible. It also defines the architecture we use for big data analysis and provides an in-depth look at one of the most important components—an Apache Hadoop* cluster for storing and managing large volumes of poly-structured data.
I had an interesting question come across my desk a few days ago: “Is it still worthwhile to understand T-states?” My first response was to think, “Huh? What the heck is a T-state?”
Doing a little more research, I discovered that, yes, there is something called a T-state, and no, it really isn’t relevant anymore, at least for mainline Intel(R) processors.
Let me say this again: T-states are no longer relevant!