    Microsoft Surface Pro reviewed from a scientist's perspective. This is the future of personal computing.
    By Hontas Farmer | February 24th 2013 10:45 PM | 39 comments | Print | E-mail | Track Comments

    With the Surface Pro, Microsoft has made a statement about what a PC should be for the next five to ten years.  Rather than fading away, the PC will become thin, light, and easy to use while retaining the flexibility that has always been the hallmark of the personal computer.  That said, the Surface Pro is not perfect for all uses.  These are my impressions after spending a week with the Microsoft Surface Pro 64 GB.

     

    I have been an avid Linux user since 1997.  M$ has been part of my life only because I like to play games, and they have the best platforms for gaming.  With the Surface Pro they have created the best mobile computer on the market today.  This is not a clone of the iPad.  This is a statement about what the PC should be.  Surface Pro is a declaration that the “Post PC” era hasn't come and never will.

     

    For the longest time I have been using an HP TouchSmart TM2, which weighed about 4 lbs.  The Surface Pro has far better specs and is half the weight.  Everything my old convertible tablet could do, this new device does better.  It even runs Ubuntu Linux with little difficulty, with most functions working.  (Only the Wi-Fi is lacking, but that will come along with future Linux kernel releases.)  This PC can do everything I need a PC for on a day-to-day basis as a science student.  With the optional “Type” cover I can write LaTeX or MATLAB code on it.  I can take handwritten notes on it.  I can make a PowerPoint presentation on it.  Scientists should consider un-tethering from their desks, while still working, to be a real option.

     

     I have had an Android tablet.  Such tablets generally can't be used for productive content creation.  In fact, all Android and iOS devices are good for is buying content created by others.

     

    The storage sizes available for the Surface Pro are rather small compared to today's hard disk drives.  That said, look at how many SD cards and USB pen drives we technophiles have lying around.  Look at all the different cloud storage options we have.  There is no reason that the vast majority of files, and even programs, need to be on the system drive.

     

    One of the best and worst aspects of the Surface Pro is the screen.  On one hand it is very sharp; Retina-display sharp.  On the other, it is small compared to most PC screens and can be hard to read.  For that reason alone I would suggest that anyone with a Surface Pro, or any similar tablet, at least keep their old laptop or desktop for the times a big screen is essential.  I would not want to watch a ripped DVD on the Surface Pro unless I was watching alone.


    Improvements I would like to see in the future

    I would like to see a docking option in which the Surface Pro is inserted into a large touchscreen.  The dock would contain more storage for playing 3D PC games and movies.

    I would like to see tablets made with better graphics options.  The Intel HD 4000 is far better than the stand-alone graphics in my PC from three years ago.  However, it pales in comparison to the best available graphics chips.

    In the future the main storage of these devices should be user serviceable and upgradeable.

     

    In short, the Surface Pro is a good buy as a computer for real-world science and educational use.  Microsoft has demonstrated decisively that the PC as we have known it since 1980 isn't going to go away.  It will only change its shape.

    Comments

    You nailed what every other computer scientist and serious PC advocate believes, and summarized it. I look forward to the day there is a Surface with a good, strong Tegra GPU and Linux compatibility in it too. I hope Microsoft reads this and does it soon. I am quite sure they will. They are just taking an Intel approach and gradually getting to the perfect tablet, but first making all the money they can while almost getting there. Funny how the non-Pro has a Tegra GPU but the Pro does not, as if the Tegra could not fit, lol. The next one has got to be a gaming tablet; there is nothing left in between, and the Tegra can do it. Maybe it can have that new Tegra feature that syncs with the 600-series GeForce. Thanks for summing up for Microsoft what we all want in a very straightforward summary. Microsoft, do it now, please.

    Hfarmer
    I can see an ARM or a very good AMD Fusion processor being the heart of a sort of Microsoft X-Tablet.  A device which could replace or complement a future Xbox.  ARM processors of today are likely able to do much, much more than play Angry Birds.
    The main reason to go with Intel in the Pro is to use all the legacy software.  I can run Linux software from 1997 on this thing.  Great software has a way of lingering long after it is no longer actively developed.

    Science advances as much by mistakes as by plans.
    Gerhard Adam
     I can run Linux software from 1997 on this thing.
    I found this a bit amusing.  Perhaps when these systems can run software from 1967, then they'll understand what compatibility and portability mean.

    In any case, don't mind me ... I'm having a bad technology day and I'm just getting a bit tired of these modern-day systems that are resource-intensive, function-poor, and lack the simple concepts of availability and security.  Most of them simply demonstrate how readily developers can waste decades and still not get it right.
    Mundus vult decipi
    Ladislav Kocbach
    I am writing about another use of small computers and touch - perhaps it could help you get through a bad technology day? I write about my playing with the Raspberry Pi - see http://www.science20.com/physics_and_computing/blog/tablets_creativity_and_science_%C2%A0and_raspberry_pi-104519. You can compile things from 1967 if you have the FORTRAN code. There is a gfortran on my little Raspberry talking to a cheap tablet via a wire (I can now buy some $15 toy to make it wireless too).
    Yeah, ARM processors are really dominating, but for gaming, being compatible with legacy programs is important to me. I can't wait to be able to use this with Steam in my off time and the Eclipse program for work. ARM will probably win the race in power in the short term due to their brilliant designs, but legacy programs are important to me. I want the mobile world to merge with the developed PC world. And Intel has had a blueprint to supersede silicon for years now; they're just staying slightly ahead to get their money's worth in the meanwhile. In the long run Intel will still be around and lead; it was just unexpected and brilliant for a company like ARM to design processors per specific need, maximizing the performance that is needed instead of just everything.

    I want the mobile world and the current PC world to merge, so the legacy portion of Intel compatibility is important. Plus, with Intel already having blueprints for the future beyond silicon, I think Intel needs to take the lead for Microsoft's current OS to last. ARM is amazing right now, but for the long-term future and compatibility with the past, I really like the combo of Intel and Tegra.

    Hfarmer
    Me too.  There is no logical reason we should all walk around with one really powerful computer, sit in front of another really powerful computer, then entertain ourselves with yet another really powerful computer.  With a little imagination a company could sell us just one such computer, which would then dock to other devices to perform those functions.
    Where's the money in that?
    Science advances as much by mistakes as by plans.
    Oh yeah, and the Tegra 4 GPU is the only mobile GPU that I know of that can sync with a desktop GPU, so if they were to make a portable Xbox to go hand in hand with the console Xbox, that feature comes in handy.

    MikeCrow
    Is Surface Pro a derivative of Win 8?
    I just went and looked, and it is. Which means you can build as big a system as you want running what the Surface runs; the difficulty is making it mobile, which will happen in time. But how big a display do you want to drag around with you?
    Never is a long time.
    Hfarmer
    The MS Surface Pro (hardware) runs Windows 8 Pro (software).  While one can run Windows 8 on any PC, the Surface Pro is more than just a generic tablet.  Think of all the best aspects of the iPad.  Think of all the best aspects of an MS Windows notebook.  Then combine them.  That is the Surface Pro.
    Only a few companies are doing a Windows 8 desktop correctly.  HP and Samsung have full-sized touchscreen all-in-one PCs.  If one only works at a desk, they are just fine.  :)
    Science advances as much by mistakes as by plans.
    MikeCrow
    I do understand that. But for some usages, the computing power you can put into a mobile platform isn't going to be enough. Though the Samsung does look pretty cool. I could see it as part of a living room. Or a school desk.

    But for me it would need a big honking server connected to it. But you could do much the same: have a huge server that you remote-desktop to with your Surface. But if you need to type something, it has to reduce display space, or you have to have an external keyboard. Someone who types a lot of code isn't going to like that much. That's sort of my same feeling about voice recognition; I don't see a bunch of people in a cube farm writing code by talking to their computers. But there are a lot of applications that either of those user interfaces would probably work well with. That was sort of my point.
    Your point about Android and iOS devices is true, but there are a lot of people who don't develop.
    Never is a long time.
    Hfarmer
    I see what you mean.  Especially for workers in cubicle land.  A keyboard and mouse will be the standard in an office setting (coding, data entry, etc.) for a long time to come.  At least for me, a tablet allows me to do my work with a computer wherever I am comfortable.  My field crosses the line between pure creativity and coding/office work.
    What I think of when I look at my Surface and give it a voice command is this kind of thing.

    But for me it would need a big honking server connected to it. But you could do much the same, have a huge server that you remote desktop to with your Surface.
    Kind of like the main computer on the enterprise. 



    The MS Surface has remote desktop ability.  The only reason I don't use it is because the processor in this thing is better than the one in my old laptop.  Perhaps if I have a 12-core beast with terabytes of space, tons of RAM, and dual video cards for crunching numbers, I will use that.

    Now I just need a device that can make me both a turkey sandwich and hot Earl Grey tea when I ask for it. 
    Science advances as much by mistakes as by plans.
    MikeCrow
    Exactly, and since it is Windows, it has remote desktop on it.
    When you want to find a comfortable spot and work on/look at experimental data, Surface sounds like a great idea. I can also see voice recognition getting good enough to do work a secretary might do. I can also see it used for reviewing drawings and such on the manufacturing floor.

    Now, I have a 6-core 3.3 GHz 3rd Gen Xeon (i7 class), 64 GB of RAM, and a ~3.5 TB SAS 15k-rpm RAID system to crunch my numbers. And I can drag my laptop downstairs, RDP to my server, and keep working; it's just not as nice as when I have dual screens and a mouse and keyboard to work from. Though I keep pestering my boss for the second processor and a couple of really big drives :)

    I'm not sure how much smaller Field Effect Transistors will get, but it won't be long until we can start building room temp quantum devices, so I'm guessing 16-128x levels of integration minimum still to go.
    Never is a long time.
    UvaE
    In fact, all Android and iOS devices are good for is buying content created by others.


    Hfarmer
    "like".  
    Science advances as much by mistakes as by plans.
    Hfarmer
    One update.  I reviewed a 64 GB version of the Surface Pro.  In order to really play a game on it and be able to dual-boot Linux, I needed more space.  So I exchanged it for a 128 GB version.  If one does not perform either of those tasks, the 64 GB version of the Surface Pro is more than sufficient.
    Science advances as much by mistakes as by plans.
    As for keeping an old laptop and desktop for screen size: you can easily have a 27-inch monitor plus a USB hub with wireless mouse and keyboard at home, use the Surface's DisplayPort output (you could even use your TV), and it then becomes your desktop. Unless you need gaming, it should be a very capable desktop machine.

    I've got my rig for games already (built with a 3770K, 670 GTX, 240 GB SSD, 24 GB RAM, etc.) but can't wait to get a Surface as my "laptop/tablet". Honestly I've been waiting for this since the early '00s, and the iPad made me just want it more. The iPad is just a large iPhone/toy. Yes, I've used it for surfing and mobile games, but that's it. I want the ultimate all-in-one device!!! The Surface is definitely for me - now if only I can find and order a 128 GB one!!!

    Hfarmer
    As for keeping an old laptop and desktop for screen size: you can easily have a 27-inch monitor plus a USB hub with wireless mouse and keyboard at home, use the Surface's DisplayPort output (you could even use your TV), and it then becomes your desktop.

    That's a very good point.  With its USB 3.0 port, that is a real possibility.  I have an old USB 2.0 port replicator that I've used with my old PC.  It might do the trick.
    I've got my rig for games already (built with a 3770K, 670 GTX, 240 GB SSD, 24 GB RAM, etc.) but can't wait to get a Surface as my "laptop/tablet". Honestly I've been waiting for this since the early '00s, and the iPad made me just want it more. The iPad is just a large iPhone/toy. Yes, I've used it for surfing and mobile games, but that's it. I want the ultimate all-in-one device!!! The Surface is definitely for me - now if only I can find and order a 128 GB one!!!


    Yeah, the iPad is just a toy compared to a real PC.  Eventually all those people on iPads will want more.  Then Apple may just go the way of Commodore.



    Science advances as much by mistakes as by plans.
    Thanks for the review, I found it very useful. I briefly played with the Surface Pro at a store and I liked it. It is also good to know it can run Linux.

    rholley
    Recognize me among this lot?



    (Main picture: Gibraltar Neanderthals, Wikimedia)
    Robert H. Olley / Quondam Physics Department / University of Reading / England
    Hfarmer
    That gave me a genuine laugh. :)  
    I'm sure somewhere someone is still using Windows 3.1 on a 486. 
    Science advances as much by mistakes as by plans.
    I am a computer science graduate, and first of all I believe that a few years from now these tablets will be capable of powering everything the strongest desktops can. We may no longer keep up with Moore's law, but every year we can shrink the same power into a quarter of the space it used to need; shrinking processors seem to be the future.

    After graduating college, with me working as a system administrator, my twin (who also went into computer science) and I both agree that the idea of a server is primitive and is also soon going to change. Technologies like those behind illegal torrenting are now beginning to be used legally for local document availability and sharing, which is a huge advantage, since you no longer have the bottleneck of a single Ethernet wire's speed and the hard drive in the one server.

    My twin and I, while in college, used a system for file storage and sharing that at the root was more like torrenting, but seemed to the end user just like accessing a file on a server. All the computer scientists one year worked on this system so we could all collaborate on work together easily, since we did not have access to server space there. We developed it well enough that, as long as three or more people had the full version of the files, you could even dynamically host a website on this system we made. This is also faster than any single server, since you have many hard drives and Ethernet wires working together, multiplying the access speed of the file you want to access. Since most of these tablets have SSDs, they will only benefit this technology, which will supersede servers. And virtualization and sharing processing power have always been possible over LAN networks without the need for one server to manage it. So once the technology of local networking evolves (which, as a young graduate, I have already seen done a bit, to be more efficient in data access), I think these tablets will be able to do more than most expect and may someday make the desktop extinct.

    In 5-7 years from now, I bet these tablets will be as powerful, if not more powerful, than our strongest desktops today, and capable together of doing more than we are currently able to do over a LAN.

    Gerhard Adam
    It's interesting that you would think the concept of a server is primitive.  Here's my prediction ... there will always be a server function.  Always.

    The notion of data sharing [for update, as I believe you're indicating it] is almost impossible except under the most ideal circumstances, since data coherency is intrinsically difficult and unreliable.  Loss of communication alone will kill such a system.  Successful implementations that address coherency problems don't scale well, because the overhead is enormous.  Hard drive and Ethernet speeds are far too slow to be of much value, except for small applications or a select user base.

    Mundus vult decipi
    Hfarmer
    Well I agree and disagree with both of you.  
    The idea of using something like torrenting to share files on a network is a good one.  For example, why not distribute a document in your organization to a number of seed clients, then have any future clients download that document from those seeds?  In such a system the "server", as a piece of hardware, is eliminated.  Just as the "mainframe" went the way of the dodo.
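The seed-distribution scheme described above can be sketched in a few lines of Python. This is a minimal illustration only; the registry layout, document name, client names, and both helper functions are invented for this sketch, not part of any real system discussed here.

```python
import random

# Hypothetical registry mapping each document to the clients holding a full copy.
seed_registry = {
    "quarterly_report.pdf": {"client-a", "client-b", "client-c"},
}

def pick_seed(doc):
    """Return a random seed client that holds a full copy of doc."""
    seeds = seed_registry.get(doc)
    if not seeds:
        raise LookupError(f"no seed holds {doc!r}")
    return random.choice(sorted(seeds))

def register_seed(doc, client):
    """Once a client finishes downloading, it becomes a seed in turn."""
    seed_registry.setdefault(doc, set()).add(client)

# A new client downloads from a random existing seed, then seeds the file itself,
# so each download adds capacity instead of loading one central machine.
source = pick_seed("quarterly_report.pdf")
register_seed("quarterly_report.pdf", "client-d")
```

The point of the sketch is the last two lines: every completed download grows the seed set, which is exactly why this scales better than a single file server for read-mostly documents.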

    On the other hand there are some servers which share out finite physical resources.  Things like printers, faxes, and scanners will always be in limited supply.  They will also always be needed because sometimes nothing is as good as paper. 
    Science advances as much by mistakes as by plans.
    rholley
    ... sometimes nothing is as good as paper.
    Or parchment, even!



    When that April, with his showers sweet,
    The drought of March hath piercèd to the root,
    And bathèd every vine in such liquôr,
    Of which virtue engendered is the flower;
    The first lines of Chaucer’s Canterbury Tales, spelling modernized.



    Robert H. Olley / Quondam Physics Department / University of Reading / England
    Gerhard Adam
    Just as the "mainframe" went the way of the dodo.
    See, it's statements like that which demonstrate how little most people understand about computing.  Check the link I provided, so that you can see what "mainframe" computing is all about.  Without "mainframes", businesses truly would be sitting around campfires pounding out their messages on rocks.
    On the other hand there are some servers which share out finite physical resources.  Things like printers, faxes, and scanners will always be in limited supply.  They will also always be needed because sometimes nothing is as good as paper.
    I'm really surprised at this kind of reasoning.  What do you think it takes to maintain an inventory and ordering system for a company like Walmart?  Even something like printing sounds incredibly naive because of the failure to recognize the volume of paper that is generated.  While there are many options today for people to pay bills online, what do you think it takes to generate bills monthly for millions of utility customers?  or bank statements? 




    Mundus vult decipi
    Gerhard Adam
    I finally found some good images to convey enterprise level printing operations.  Most large corporations have several of these types of printers or centers, but I thought this would help folks understand more precisely what large-scale computing is about.

    If you go to the provided link, you can watch a short 2.5-minute video of a printer in action.  It is noteworthy that running these printers is potentially life-threatening if one isn't careful.  Take note of the size of the roll-fed paper [weighing in at 1000 lbs].  The paper actually runs through two printers for full two-sided printing [both sides of the paper].  If you look closely you can see where the paper gets turned over.

    When you see the printer running at 720 pages per minute, you get a better appreciation of what large means.  It is especially interesting to watch the printer about halfway through the 2.5-minute video to appreciate how fast full-page laser printing can be.  BTW, the noise you hear near the end is the individual sheets being cut and stacked.

    http://www.biline.ca/4100.htm

    This is the printing subsystem for mainframes (z/OS) as well as UNIX and Windows servers.  However, I expect that this is sufficient to demonstrate why the idea of servers being obsolete is a rather quaint notion.
    Mundus vult decipi
    Hfarmer
    Did I not say that printers would always require servers because they are in limited supply? Yes I did. 
    Files, on the other hand, can be distributed across the plurality of hard drives and other storage devices available on any enterprise's network.  Just consider the gigabytes of space in the cell phones of a large corporation, plus the storage on all their laptop and desktop PCs.  With some clever programming, I'll bet that file servers could be done away with.
    Science advances as much by mistakes as by plans.
    Gerhard Adam
    Did I not say that printers would always require servers because they are in limited supply? Yes I did.
    ???  What does limited supply have to do with anything?  Do you think there aren't enough printers available? 

    I had hoped that with the printing you would see the sheer volume of data that was being produced [and this is post-processing].  This says nothing about data coherency issues, security, and disaster recovery requirements. 
    With some clever programming I'll bet that file servers could be done away with.
    They can be eliminated for trivial problems.  Backup/recovery and security are more than big enough problems to demonstrate that it can't be done for important data.  One does not improve systems reliability by increasing the points of failure.

    This is one aspect of what is currently being struggled with regarding medical records, and despite the optimistic claims, you can already see the writing on the wall.  Computing projects reflect a long history of optimistic claims and expensive failed systems in their wake.

    I found it interesting that the previous commenter placed so little concern on the issue of lost data.  It simply illustrates a failure to understand the problem.  Similar comments to other articles illustrate the same kind of "blind spot" where simple questions like ... who is to be responsible for backing up data aren't even answered.

    It is easy to claim that there is no worry because multiple copies will exist amongst users, but how will you find them?  More importantly which versions may exist?  However, the biggest problem is to answer the question of how can you be confident that not finding a document is synonymous with the document not actually existing someplace?  These are basic problems and yet I find it interesting that they are passed over with glib comments about how it isn't necessary to worry about them.

    Most of these systems that are developed as college projects suffer from confirmation bias, since the individuals involved invariably test things to see how well they work and orchestrate specific scenarios for testing.  This is precisely why most of these systems fail in the "real world", where all manner of unexpected events actually occur.
    Mundus vult decipi
    MikeCrow
    With some clever programming I'll bet that file servers could be done away with.
    They won't be, it's not a programming or space problem. As I said below it's a security issue.
    Never is a long time.
    Gerhard Adam
    This is also faster then any single server since you have many hard drives and ethernet wires working together multiplying the access speed of the file you want to access.
    This simply demonstrates a lack of understanding of how the technology works.  When one can have RAID devices attached on 8 or more 100-MegaBYTE fiber-optic channels, your Ethernet connection is more like smoke signals than actual computing.  It is important to note that these 8 channels are connected to one server to minimize queuing delays, etc.

    However, parallel I/O operations can't exist in practice, because memory access is serialized, so the notion of having multiple paths from multiple sources for a single piece of data is simply a huge misunderstanding.  As long as each piece of information is a discrete piece of data that is broken up and re-assembled [like a TCP/IP packet], there is a single point at which this data must be reassembled before it is stored in a system.  However, for large-scale file activities, this would be a huge step backward and would incur significant amounts of overhead [which is precisely what it already does in TCP/IP networks].
    Mundus vult decipi
    No, we have done this already at a school. It is not nearly impossible at all. Loss of data is not an issue, since the originator should not lose their file unless they intend not to hold on to it anymore and delete it from their device. As long as there is one full copy, the file is still available, just only as fast as from a normal server. People do not always need to be connected to the system for the file to be accessible, because the way this works is that the system multiplies all the files across every device that connects to it, not only the ones you want access to, and then deletes them once you disassociate yourself from it.

    This technology works very well when the devices are in a work scenario where they usually communicate often with one LAN, or with the LAN over the WAN. It's obviously not ideal for BYOD, but for business and school scenarios where servers are needed and the devices you use are issued to you, not your own, it does not matter what data is on the device, just what you have access to. Kind of like how Skype uses clients to relay information in the background that has nothing to do with you. Not all files will be saved entirely to every device, but a certain amount to a certain number of devices, rotated according to efficiency and use.

    We made a cluster of 10 of our student computers do this, and it was a success. There was never a worry about a loss of data, and with ACLs and other permission mechanisms it works really well, though security is still the final part we were working on for real office use. It is by far not impossible. Most businesses have petabytes of storage across their offices but only use terabytes on the servers. This redundancy across petabytes still leaves them more space to use, and it works across a LAN and WAN network more efficiently.

    Gerhard Adam
    ...loss of data is not an issue, since the originator should not lose their file unless they intend not to hold on to it anymore and delete it from their device.
    Who is this originator?  Most business systems have long-term databases and files for which there is no "originator" in the sense you're using it.  If you're only talking about small personal files, then it's strictly small-scale.  Given that many enterprise files may be Giga-bytes in size or larger, then there is no feasible way to move those across a network in the manner you're describing.  More importantly, security becomes impossible and supporting synchronizing data updating is a preposterous notion.  You can't even reliably control serialization of access.  You'll overload every processor and network on which you're connected.

    As far as not saving entire files ... again ... this is already the concept behind caching, so it's nothing special, however even in the simplistic implementations typically seen on PC's it's hardly reliable and does not result in high performance.
    ...there was never a worry about a loss of data...
    ???  There would ALWAYS be a worry about losing data.  In addition, data corruption is another issue that would have to be addressed. 
    ...this redundancy across petabytes still leaves them more space to use, and it works across a LAN and WAN network more efficiently.
    I don't think you have actually seen what a large enterprise uses.  Those petabytes of storage you're referring to ... in most large businesses does NOT represent much of the actual business data.  That is generally not distributed on LANs.

    I guess I should ask, what you consider to be a "large" server?  How many processors?  How much memory?  How many attached devices?
    Mundus vult decipi
    Have you ever used torrents before, my friend? Over a WAN they can distribute files gigabytes in size, let alone over LAN speeds. Secondly, the problem with torrenting is the fact that nothing ever gets lost; that's how the illegal world survives. I am a network and system engineer for a whole college with 5,000+ employees and 20,000+ students; I am well aware of how big servers are and how needed they are. I manage and set up the same strategy we have always been using in the past. I set up and manage servers from 1 CPU to 8 CPUs for web-based work and simple file sharing, up to big cluster-based virtualization servers with over 30 Intel Xeons, and I work on setting up Cisco-based networking routers and switches. I also work with even the old mainframes they use and keep them running. And these systems are still bottlenecked by the speed of the fiber connected to them and the drives in them, compared to the redundant system we worked on in college. I work at the same college I recently graduated from. I know the system we created was just as efficient, if not more so in terms of money, than what $50,000 of equipment would get me now, working as a system administrator in hardware. I am not oblivious to what's out there; I am just aware of the research that big top-of-the-line universities are doing and what effect this research may have on the way computers evolve.

    Gerhard Adam
    As I said, it's a relatively small configuration. 

    This link is an example of enterprise level computing, and believe me ...these companies don't have just one of these.
    http://public.dhe.ibm.com/common/ssi/ecm/en/zsq03058usen/ZSQ03058USEN.PDF
    Mundus vult decipi
    Computer science always starts off with a concept that once seemed difficult and not easy to do, but practical in the end result, and after many revisions of algorithms the program makes something possible. Yes, today, without the right software, loss of data and such could theoretically be an issue if programmed poorly, but it's not really that hard to overcome either. Checksum files prevent data corruption; encryption and ACLs give security; and data loss is not an issue as long as the devices on the system are owned by the corporation and stay there for many years.

    First of all, caching is nowhere near related to this. Disk drives don't cache things like this, and most networks don't have packets coming from multiple sources. Caching is only semi-relevant to this over a network with redundant switches and routers, which only barely use this technology in case one switch or router goes down. In new environments we use cut-through switches, since that's more efficient anyway, and caching is irrelevant; everything just runs, theoretically, on the 3rd layer. This is entirely different from redundant networking for secure data routes; this is for performance, where packets can individually, at every millisecond, find their fastest route from the fastest source.

    I am well aware of the technology out there. And when it comes to the originator, it's the first person to save the file, that simple. There is always one person to save the file first; from there on it spreads out. It's not about whether or not multiple people collaborate on a file at once, or many people work to create a document. The computer world does not see people as people; it sees us as devices, and the first device to have the file is the originator. Now, I think I have said enough. If you want to agree or not agree on it being a benefit, that's your own choice. Unfortunately, this is after something has been mostly developed.

    But either way, I am pretty sure some of my fellow graduates working at Microsoft, Cisco (my brother), Oracle, etc., who worked on this project with me, will somehow eventually bring some of this technology into the future of these computers many years from now.
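The checksum idea mentioned in the comment above (using per-chunk checksums so corruption can be detected and repaired without re-fetching the whole file) can be sketched in Python with the standard library. This is a toy illustration, not the commenter's actual system: the chunk size, sample data, and both function names are invented here, and real systems use much larger chunks.

```python
import hashlib

CHUNK_SIZE = 4  # tiny, for illustration; real systems use chunks of 256 KB or more

def chunk_hashes(data, chunk_size=CHUNK_SIZE):
    """Hash each fixed-size chunk so corruption can be located per chunk."""
    return [hashlib.sha256(data[i:i + chunk_size]).hexdigest()
            for i in range(0, len(data), chunk_size)]

def corrupted_chunks(expected, received_data, chunk_size=CHUNK_SIZE):
    """Return the indices of chunks whose hash does not match the manifest."""
    actual = chunk_hashes(received_data, chunk_size)
    return [i for i, (e, a) in enumerate(zip(expected, actual)) if e != a]

original = b"hello world!"
manifest = chunk_hashes(original)           # published alongside the file
damaged = b"hellO world!"                   # one byte flipped in chunk 1
print(corrupted_chunks(manifest, damaged))  # -> [1]: only chunk 1 needs re-fetching
```

The per-chunk manifest is what makes torrent-style repair cheap: a receiver that detects a bad chunk re-fetches only that chunk, possibly from a different peer, rather than the entire file.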

    Gerhard Adam
    I am well aware of the technology out there. And when it comes to the originator, it's the first person to save the file, that simple. There is always one person to save the file first; from there on it spreads out. It's not about whether or not multiple people collaborate on a file at once, or many people work to create a document....
    A document?  What are you talking about?  Who do you think owns the files for a bank when there may be well over a million accounts?  We aren't talking about people simply sending a few documents around to each other.  Just as an example JP Morgan Chase processes over 1 million business transactions per hour, every hour [24 hours/day; 7 days per week].  Over 7 billion business transactions per month.  That's what enterprise level computing is about.  It isn't a few people sending e-mails and WORD documents.

    It's also quite telling that you mentioned the "old" mainframe.  That tells me all I need to know.
    Mundus vult decipi
    MikeCrow
    This is not how a business would manage their product IP; I know, because that's what I've done for the last 15 years for hundreds of companies, many that you use and own products from.
    IP is not distributed to client systems; it would only be available to a client if they had a need (and in some cases they can only view documentation) and had the appropriate security. Additionally, there are technology export laws that limit where IP can go (ITAR).
    They also don't clog their networks streaming gigabyte files everywhere.

    Businesses are very risk-averse and are managed to make a profit. They don't usually jump on the very latest ideas; we have a saying, "Pioneers catch arrows", and the ones that do frequently get burned.
    Many of these companies have just adopted an electronic version of the paper process they developed before they had computers in many cases.
    But some of what you've discussed probably has some value, and in time some of it could end up in customer deliverables. But unless you can offer huge savings with 3-6 month ROIs (and even then it can take decades for adoption), it will be more evolution than revolution.
    Never is a long time.
    Ladislav Kocbach

    In short, the Surface Pro is a good buy as a computer for real-world science and educational use.  Microsoft has demonstrated decisively that the PC as we have known it since 1980 isn't going to go away.  It will only change its shape.


    Amen.