CIOs Slow to Embrace Cloud Computing

If I were a CIO listening to industry pundits muse about what to call my job when my day-to-day tech management skills were no longer needed, I wouldn’t be in a hurry to switch to cloud computing either — not in this economy.

I agree. It seems everyone selling hosted systems wants to reduce a company's costs by claiming their product will eliminate the need for an IT department. But what about the computers the end users rely on to access your product? Printers, e-mail, internal file servers, domain controllers, and so on. Cloud computing will still need the IT department; in fact, for enterprise cloud applications I see the IT department designing these systems alongside their business departments and the vendors.

Just my thoughts….

Private Cloud vs. Public Cloud Application Development

One of the development challenges in going from a private cloud system to a public cloud system in enterprise application development is the ActiveX Data Objects (ADO) communication between your database and your client. In the public cloud we moved to ADO.NET 2.0. We had an advantage because we had used this technology for years with our mobile applications, so it was really no big deal. However, ADO.NET 2.0 is much slower than ADO 1.0. I know security concerns are the reason we must use it, but is that really true? When we have to write SQL queries accessing millions of records in one search, we need all the performance we can get. I am all for security, but slowing performance in the name of security on the web server is just the wrong approach. We need speed and security. It shows that the public cloud platforms are not looking toward the future, which is accessing large record sets with both the speed and the security to perform those tasks. We at Adept have developed major workarounds that accomplish what our clients need, but Microsoft and the other large players need to move faster in providing developer tools for this. Most of our applications work because of the developer tools we have to write ourselves, because Visual Studio just doesn't have the tools to do it.
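For illustration, here is a minimal ADO.NET 2.0 sketch of the kind of large-record-set query I am talking about. The connection string, table, and column names are hypothetical; CommandBehavior.SequentialAccess is one of the few knobs ADO.NET gives you for streaming big results instead of buffering them.

using System;
using System.Data;
using System.Data.SqlClient;

class LargeQueryDemo
{
    static void Main()
    {
        // Hypothetical connection string and schema, for illustration only.
        string connStr = "Server=dbserver;Database=Permits;Integrated Security=SSPI;";
        using (SqlConnection conn = new SqlConnection(connStr))
        using (SqlCommand cmd = new SqlCommand(
            "SELECT PermitId, Status FROM Permits WHERE IssueDate >= @since", conn))
        {
            cmd.Parameters.Add("@since", SqlDbType.DateTime).Value = new DateTime(2008, 1, 1);
            conn.Open();
            // SequentialAccess streams the result set rather than buffering
            // each row, which matters when scanning millions of records.
            using (SqlDataReader reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
            {
                long count = 0;
                while (reader.Read())
                {
                    count++; // process each row without materializing the whole set
                }
                Console.WriteLine("Rows scanned: " + count);
            }
        }
    }
}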

Backing Up Virtual Machines


Great Article!

I cannot imagine a total loss of data. Anyone who is not backing up, and backing up the backups, is going to lose data.
Many years ago I was called to a county in Florida, on the beach (which was nice), whose Permits Plus database was corrupted. They had been down for a few months and were issuing permits by hand. Since I had left Accela I hadn't worked on Permits Plus for a few years, and I never wanted to work on that system again. Anyway, it was a database problem, and I had them up and running in a few hours. BUT they didn't have any good backups; the backups were corrupted. So you can back up all you want, but you need to test those backups and make sure they are good. To make a long story short, they lost six months of data and had to re-enter the permits by hand, which wasn't that bad. BUT if they had a backup that worked, and it had been tested by IT, they wouldn't have lost any data. IT should have been fired on this one!

One of my favorite stories is Coweta County, GA... talk about Murphy's Law. The IT director there, Cecil, was the best I have ever worked with. He is retired now, but what he did with his backup systems was smart. A crazy set of events would have destroyed any normal IT department's basic backup system, but not Coweta's. Coweta did tape backups and backups to backup servers, along with other backups I will not talk about here.

Their payroll system went down: the production server died. The backup server was up, but its data was somehow corrupted, so they went to the tape backup system; the tape got eaten by the tape drive, which then died as well. So they went to the second backup server system. It worked, but then it crashed and died too; still, they succeeded in getting the data onto the correct backup production server. Everyone got paid! New servers were ordered, and Dell was shipping them ASAP. Then the UPS guy showed up with the new servers and dropped them about 8 feet off the truck loading dock, destroying them; Dell shipped a new batch that UPS had to pay for. While this mess was going on, production was running with very limited backups. Very scary. However, Cecil had already seen that things were getting too scary, and once the tape drives were running again, he made sure the backups were doubly safe and moved them onto another server as a double backup.

What I am trying to say here is that you MUST have redundant backup systems, with different modes of backup media. Just one isn't going to do it! And a human being needs to be in charge. I have seen some crazy stuff in my career.

Microsoft’s slow-moving cloud may overshadow rivals

I feel Microsoft will be a huge winner in on-demand applications: from access to its data centers, to providing its applications in the cloud, to providing its services to end users and to other software corporations. I do hope that Microsoft steps up to the plate more; I have had a few conference calls with them on this issue and many more exciting ones. I see Microsoft as a major partner and a MUST HAVE in application development and hosting in the cloud.

Code Access Security (CAS) For Cloud Applications: Let's NOT Trust the Developers

ASP.NET declares its five trust levels in the securityPolicy section of web.config:

<system.web>
    <securityPolicy>
        <!-- Each named trust level maps to a CAS policy file; "Full"
             is built in and applies no code access restrictions. -->
        <trustLevel name="Full"    policyFile="internal"/>
        <trustLevel name="High"    policyFile="web_hightrust.config"/>
        <trustLevel name="Medium"  policyFile="web_mediumtrust.config"/>
        <trustLevel name="Low"     policyFile="web_lowtrust.config"/>
        <trustLevel name="Minimal" policyFile="web_minimaltrust.config"/>
    </securityPolicy>
</system.web>

Microsoft:

Today's highly connected computer systems are frequently exposed to code originating from various, possibly unknown sources. Code can be attached to e-mail, contained in documents, or downloaded over the Internet. Unfortunately, many computer users have experienced firsthand the effects of malicious mobile code, including viruses and worms, which can damage or destroy data and cost time and money.

Most common security mechanisms give rights to users based on their logon credentials (usually a password) and restrict resources (often directories and files) that the user is allowed to access. However, this approach fails to address several issues: users obtain code from many sources, some of which might be unreliable; code can contain bugs or vulnerabilities that enable it to be exploited by malicious code; and code sometimes does things that the user does not know it will do. As a result, computer systems can be damaged and private data can be leaked when cautious and trustworthy users run malicious or error-filled software. Most operating system security mechanisms require that every piece of code must be completely trusted in order to run, except perhaps for scripts on a Web page. Therefore, there is still a need for a widely applicable security mechanism that allows code originating from one computer system to execute with protection on another system, even when there is no trust relationship between the systems.

More on CAS:

Although it can seem daunting to move a web application developed under Full Trust to Partial Trust, it is easier than it looks (what a pain, though). The idea is to make the web app as 'dumb' as possible, moving business logic into a facade assembly and using Assert statements to stop the stack walk. A custom permission helps to authorize upstream code. A minimal sketch of the facade-plus-Assert pattern follows.
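This is only a rough sketch under assumptions: the facade assembly would need to be strong-named and installed as trusted (for example, in the GAC), and the class, table, and connection-string names are hypothetical.

using System;
using System.Data.SqlClient;
using System.Security;
using System.Security.Permissions;

// Trusted facade assembly: this attribute lets the partially trusted
// (e.g., Medium Trust) web app call into it.
[assembly: AllowPartiallyTrustedCallers]

public static class DataFacade
{
    public static int CountPermits(string connStr)
    {
        // Assert stops the CAS stack walk here, so the Medium Trust web
        // app above us does not itself need SqlClientPermission.
        new SqlClientPermission(PermissionState.Unrestricted).Assert();
        try
        {
            using (SqlConnection conn = new SqlConnection(connStr))
            using (SqlCommand cmd = new SqlCommand("SELECT COUNT(*) FROM Permits", conn))
            {
                conn.Open();
                return (int)cmd.ExecuteScalar();
            }
        }
        finally
        {
            // Always undo the Assert so later calls on this thread are
            // checked normally again.
            CodeAccessPermission.RevertAssert();
        }
    }
}

The web app then calls DataFacade.CountPermits and never needs direct SQL permissions of its own.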

This approach means that any malicious code uploaded to the web application folder is restricted by the lower trust level, and the security policy also meets the required baseline. At the same time, the functionality of the application has not been altered.

CAS defines permission sets: the operations that a given set of code is allowed to perform. To do this, CAS identifies and characterizes the application code so the appropriate permissions for that code can be determined. So defining how you want your code access security to work is all about defining what you want an application to be able to do, and not do, and then telling .NET how to figure out whether the code gets the permission set it needs to work. A small declarative example follows.
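For instance, a permission demand can be declared right on a method. This is a hedged sketch with a hypothetical class and path:

using System.IO;
using System.Security.Permissions;

public class ReportReader
{
    // Declarative demand: the runtime walks the call stack and throws a
    // SecurityException unless every caller's permission set includes
    // read access to this (hypothetical) path.
    [FileIOPermission(SecurityAction.Demand, Read = @"C:\inetpub\wwwroot\app\data")]
    public string LoadReport()
    {
        return File.ReadAllText(@"C:\inetpub\wwwroot\app\data\report.txt");
    }
}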

What I Think:

I think CAS is a good system, but it is not complete, and it does not provide many options (only five trust levels) for smart security in cloud or on-demand applications; in fact, it creates more obstacles than protection for the web server and network. If your application has bad code that allows code injection, you have a lot more problems than CAS tries to help with. Plus, I like smart web applications for many reasons. Most hosting sites use the Medium Trust level for their web hosting packages, so I love the "let's develop for the Medium Trust level" approach (see the config sketch after the table below). Here are the five trust levels; for on-demand applications we will need a lot more than the five listed below.

Trust Level: Key Capabilities and Restrictions

Full
- No restrictions imposed by code access security.

High
- No unmanaged code.
- No enterprise services.
- Can access Microsoft SQL Server and other OLE DB data sources.
- Can send e-mail by using SMTP servers.
- Very limited reflection permissions; no ability to invoke code by using reflection.
- A broad set of other framework features are available; applications have full access to the file system and to sockets.

Medium
- Permissions are limited to what the application can access within its own directory structure.
- No file access is permitted outside of the application's virtual directory hierarchy.
- Can access SQL Server.
- Can send e-mail by using SMTP servers.
- Limited rights to certain common environment variables.
- No reflection permissions whatsoever.
- No sockets permission.
- To access Web resources, you must explicitly add endpoint URLs, either in the originUrl attribute of the <trust> element or inside the policy file.

Low
- Intended to model the concept of a read-only application with no network connectivity.
- Read-only access for file I/O within the application's virtual directory structure.

Minimal
- Execute only.
- No ability to change the IPrincipal on a thread or on the HttpContext.
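And here is the promised config sketch of the "develop for Medium Trust" idea: roughly how a hosted application gets locked down in web.config. The originUrl pattern is a hypothetical example of explicitly allowing one web endpoint, per the Medium Trust restriction above.

<system.web>
    <!-- Run the application under Medium Trust; the originUrl regex
         explicitly permits outbound HTTP calls to one (hypothetical)
         endpoint, since Medium Trust blocks web access by default. -->
    <trust level="Medium" originUrl="http://services\.example\.com/.*" />
</system.web>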