I have once before put down some thoughts about computing devices and the situation for scientific use of computer technology, hoping to get some response and start a fruitful discussion. It remained a hope: some comments appeared, but not really in the direction I think is important.

I will try once more, now with a little history. The big computers of the sixties and seventies became the minicomputers of the 70's and 80's, which gave way to the personal computers of the 80's and finally the iDevices (iPhone, iPad) of the last few years (and naturally the Androids). During all this development, until recently, the computing devices have been primarily just that: computing devices. Starting from the iPhone, the whole industry seems to follow the 'great idea' of the iPhone: the user has the technical abilities of a 3-5 year old child, so let us protect the user from himself or herself - and seal off all the general functionality of the device by default. Developers can develop anything, but you must be 'a developer' to change the behavior of the device in any useful way (it is not so difficult to become a developer, but it may cost some $100, in one way or another, once or several times).

Instead of programs or 'software' there is now a new word: apps - one app for one function. Combining apps, passing data between them, and all that is gone.

One more thing has changed: now the only 'allowed' way to communicate with these devices is by fingers - one finger, two fingers, three fingers, even four fingers (perhaps even five?). No analogue of a pencil or pen, but fortunately still a virtual keyboard.

Well, what is the problem? We still have the 'real computers', desktops and laptops, to do the real work! The iDevices and Androids can be used for fun and social computing; the 'traditional' laptops, desktops and heavy computing devices for real work.

Well, we still have those 'traditional' computers, but the whole landscape could be entirely different, including the costs, the flexibility, the usability. I think first of scientific (and professional) use, but it would be relevant also for private and domestic use. What stops any alternative development?
Well, here is the statement: it is our fault. We scientists have failed in this important period of technological development. We have left the playground completely open to the commercial actors to explore our needs, create new needs and milk the public for money, without really providing reasonably efficient use of IT.

Some would say that my last statement is completely wrong - look at what we have. Oh yes, I am looking; that is why I am writing this. I will definitely come back to this subject, so for now only in short. You can have plug-in computers for about $50-$100 spread around the lab or your home, which you could program yourself to do many tasks. There are programming techniques suitable for children and grown-ups alike - visual programming. Instead of today's situation, where some of the most creative brains work on antivirus software and on breaking the various protections, we could have a different one, where all those people work on positive tasks.

Back to the commercial actors: it is not only their fault that they are making us more stupid. Some of them really have only one task - to make as much money as possible, preferably more. Well, they do not hide that; they boast about it. Some of the more 'romantic' views of Science give the scientist the opposite role - to serve knowledge and its dissemination in the first place.
The use of IT today is definitely not what it could be. The main task of IT today is to 'generate revenue' - just check. And that is not good; it is not what it could be. So what do I think scientists should do? I have many ideas in this direction, perhaps too many, but the first is simply to start thinking and talking about it. Just tell how you use IT and how you would prefer to use it instead.