    Hidden Benefit To The ACA: It May Help Bring Science 2.0 To Pass
    By News Staff | July 16th 2014
    The Affordable Care Act and data portability are forcing health care providers, and the vendors who serve them, to accelerate development of tools that can handle an expected deluge of data and information about patients, providers and outcomes.

    The volume of data is daunting - so are concerns about interoperability, security and the ability to adapt rapidly to the lessons in the data, writes Dana Gardner at Big Data Journal.

    That is why Boundaryless Information Flow, Open Platform 3.0 adaptation, and security for the healthcare industry are headline topics for The Open Group’s upcoming event, Enabling Boundaryless Information Flow on July 21 and 22 in Boston, he notes.

    Solving the issue will take a combination of enterprise architecture, communication and collaboration among healthcare ecosystem players. It's no secret that collaboration and participation are the big missing puzzle pieces in the Science 2.0 mission.

    What about informed consent in a world where, if estimates are correct, 90 percent of the world's data has been generated in the last two years? Has it become meaningless? Joe McNamee, executive director at European Digital Rights (EDRi) and Alex ‘Sandy’ Pentland, academic director of the MIT-Harvard-ODI Big Data and People Project, are having that discussion at debates.europeanvoice.com.

    The discussion is happening in the context of a Facebook emotional manipulation study published by Proceedings of the National Academy of Sciences. The authors, from both academia and Facebook’s data science team, subtly changed almost 700,000 people’s Facebook feeds to produce slightly more or less emotional content to see how it would affect what users posted.

    Nigel Shadbolt, professor of artificial intelligence at the University of Southampton, sees the upside and likens it to the agricultural revolution. Sure, some worried there would be a boom and bust and mass starvation - that is why the phrase Malthusian is still with us today - but that never happened. America alone produces enough food to feed the world now. Data, and therefore the science that needs to manage larger volumes of data, have the same potential.

    Software-Defined Storage may be part of that infrastructure. Traditional file- and block-level storage have gotten us to where we are but the future of Science 2.0 may be in object stores and software-defined storage mechanisms and new “data-defined” storage techniques.

    File-level storage works well with traditional structured data kept on individual devices such as hard drives, but as data volumes increased, many customers found success using block-level storage, where data volumes are virtualized across groups of devices that make up a storage area network.
    Object-based storage mechanisms break data away from file-based hierarchies entirely: each object is assigned a unique identifier, which completes the virtualization of data from the underlying storage device and could enable scalability that is theoretically unlimited.

    Since object stores typically run on clusters of commodity hardware - as opposed to the proprietary appliances behind big-name SANs - they bring big cost benefits to the equation.
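    The object-storage semantics described above - a flat namespace of unique identifiers instead of a file hierarchy - can be illustrated with a toy in-memory sketch. This is not any particular product's API; the `ObjectStore` class, its `put`/`get` methods, and the metadata fields are all hypothetical, chosen only to show the idea of addressing data by ID rather than by path.

```python
import hashlib
import uuid

class ObjectStore:
    """Toy in-memory sketch of object-storage semantics:
    objects live in a flat namespace keyed by unique IDs,
    with metadata (here, a checksum) stored alongside the data."""

    def __init__(self):
        self._objects = {}  # object_id -> (data, metadata)

    def put(self, data, metadata=None):
        """Store a blob and return its newly assigned unique identifier."""
        object_id = str(uuid.uuid4())
        meta = dict(metadata or {})
        meta["checksum"] = hashlib.sha256(data).hexdigest()
        self._objects[object_id] = (data, meta)
        return object_id

    def get(self, object_id):
        """Retrieve a blob by ID, verifying its integrity on read."""
        data, meta = self._objects[object_id]
        assert hashlib.sha256(data).hexdigest() == meta["checksum"]
        return data

# Usage: the caller never sees a directory path, only an opaque ID.
store = ObjectStore()
oid = store.put(b"patient outcome record", {"source": "clinic-42"})
print(store.get(oid))  # prints b'patient outcome record'
```

    Because every object is addressed by an opaque ID, the store can spread objects across any number of commodity nodes without the clients ever knowing where the bytes physically live - which is exactly what makes the scalability "theoretically unlimited."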

    Comments

    Science 2.0 is quickly changing every aspect of science research - and our lives. With all the information out there - we now have the potential to be a global community. I definitely believe the digital era is influencing our health care options.

    http://www.cennamology.com/home/-science-20-facebook-genomics-and-new-sc...

    Sorry, but too much of this sounds like hand-waving and marketing. Terms like virtualization are used to imply magical solutions, especially when the obvious issue of actually having a physical record of data is still a requirement. This is further exacerbated by the fact that this data needs to be secured, backed up, etc. There are no free lunches, and this isn't simply about a user interface.

    Solving the issue will take a combination of enterprise architecture, communication and collaboration among healthcare ecosystem players.

    Not to mention a minor miracle. No one has done anything to seriously address these issues for decades, and most of the talk is wishful thinking rather than serious endeavor. Few companies have a good handle on security. Many have minimal backup/recovery capabilities for disaster protection. System performance and availability require huge investments in technology and support, and are unreliable in vast areas of these interconnected environments.

    Basically they talk a good game, but they deliver very little. The idea of integrating health care information has been talked about for a long time. There is no reasonable project on the horizon that is anywhere close to actually achieving these results, even in the short-term. They completely overlook the largest component of data that will be impossible to regulate, so I don't see any of this happening any time soon.

    After all, a mature information technology would not be subject to such simplistic hacking. The simple fact that identity theft is one of the fastest growing crimes shows how little organizations validate the information they receive and how little they invest in protecting what they do have.