Who is Grace Hopper?

Ivy League PhD graduate. Mathematician. High-ranking, decorated Naval officer. Software Developer. Computer Programmer.

Admiral Grace Murray Hopper earned all of these titles and more. Over the course of a career spanning more than 60 years, Hopper became not only a trailblazer for women in tech, but an innovator in the many fields in which she practiced.

Without Grace Hopper, much of the work we do at Storagepipe wouldn’t be possible. Services like our offsite backup, server backup, white label backup reseller program and data protection simply wouldn’t exist without the work Admiral Hopper did when she created the first compiler.

In 1943, Hopper enlisted in the U.S. Navy — a career that would earn her the ranks of Commander, Captain, Commodore and Rear Admiral (Lower Half), not to mention a total of eight military awards. Hopper retired from the Navy at age 60, but was recalled to active duty twice. She retired for the last time at the age of 79.

Hopper was born in New York City and earned her PhD in mathematics from Yale in 1934 at the age of 28.

In 1944, Hopper joined the Harvard team behind the Mark I, one of the first computers built in America, and became one of its first programmers. That same year, she led a team that spent three months solving an equation for the Manhattan Project to help make the atomic bomb function.

Five years later, Hopper’s role in the development of UNIVAC I — the first commercially produced computer in the United States — would lead her to develop the first compiler, a program that translates human-readable source code into machine code. In 1959, Hopper helped develop COBOL, an English-like programming language for business software.

Following her final retirement from the Navy, Hopper appeared on Late Night with David Letterman in 1986. The audience received her with a standing ovation, and she gave her famous nanosecond demonstration, asking Letterman to hold an 11.8-inch piece of wire to show how far electricity can travel in a nanosecond. Letterman referred to her as “the Queen of Software.”

Following her death in 1992, the Grace Hopper Celebration was created in her honour. The annual conference is a three-day event held every October that celebrates women in computing and the contributions women have made to the field throughout history.

Hopper Hall, set to be completed in 2019 at the United States Naval Academy, will be the first U.S. military academy building named after a woman. The building will be a cyber sciences facility, in honour of Hopper’s career as a computer scientist.

It’s clear that Admiral Hopper’s role in computer science and the American military makes her one of the most important historical figures of her generation and a shining example of the part women have played in history’s most important technological advancements.

Questions? Ask Our Experts!

How (Not To) Get Free Unlimited Online Backup – DIY

Transcript:

How would you like to have free, unlimited online backup for the rest of your life? What we’re about to show you is a “Do It Yourself” system that’s commonly used by people who want to cut corners and save money on their data protection.

In this step-by-step tutorial, we’ll show you exactly how to create your own “offsite backup” software from scratch – in about 3 minutes. All you’ll need is a remote FTP account – including username, password, and network address. That’s about as technically complicated as this tutorial is going to get.

If you’re technically savvy, you can create your own FTP server for free using old computers and surplus components. Otherwise, there are many other low-cost alternatives to get your own FTP account or server.

But first, we need to issue this warning: Storagepipe does not endorse this methodology. We’re only showing you this for educational purposes. After the tutorial, we’ll go back and discuss why you should never protect your data in this way.

In this session, I will be showing you how you can use Windows to back up a local folder to a remote server using FTP. I am using Windows 7 here, but any version of Windows will give the same result.

Please go ahead and open Notepad, then start typing your FTP information. Start by typing your user name: type “user”, then a space, followed by your username. Go to the second line and type in your password. On the third line, type in “prompt”, and on the fourth line specify where on the remote server you would like the files to be uploaded, so you type “cd /your-directory”. Then type in “bin”, followed by “mput”, a space, and the path of your backup folder on your local computer. Let’s say you want to upload a folder on drive C. You will type something like this: “mput C:\your-folder” and then make sure to put a star (*) at the end so it selects all the files within this folder. Lastly, go ahead and type “quit”. That is our script so far.

Please go ahead and save this file and give it a name, any name; I am going to call it myScript. Make sure you give it an extension of .dat. Before you save the file, go to “Save as type”, select “All Files”, and hit “Save”.

Now that we have these parameters for our FTP information and our local backup folder, we will need a process that runs every day to upload the files over FTP to the remote server using these credentials. For this reason, we will be using Task Scheduler. Go to “Start”, then “Task Scheduler”. Expand it, right click on “Task Scheduler Library”, and choose “Create a Basic Task”. Give your task a name; if you want to give it more description, you can do so in the Description box below. Now hit “Next”. In this window, you will need to specify how often your script is going to run. It can be anytime — weekly, monthly, daily — and in our case we will be selecting “Daily”. You can also specify a time when your script is going to run every day, so I am going to select 8:00 AM. Hit “Next”, and then make sure to select “Start a program”.

Hit “Next” again and you will be prompted with a screen where you should input the program that should launch when this event occurs every day. In our case, we will need to run this batch script via the command prompt. To do this, we will specify the program as powershell, and we will need to pass in arguments. In the Program text box, please type in “powershell”. Then, next to “Add arguments”, go ahead and type in “FTP”, space, “-n”, space, “-s”, colon (FTP -n -s:), followed by a reference to the local script file you just saved. It could be anywhere; I have another copy of this same file on drive D, saved as FolderBackup.dat, so I will type “D:\FolderBackup.dat”. Next, hit space and specify the IP address or domain where the files are going to be uploaded. Let’s go ahead and say “storagepipe.com”. Hit “Next” and “Finish”.

After the task has been created, you can locate it in the list of tasks by clicking anywhere in the list and typing the task name. After locating it, make sure that you go to “Properties”. As you can see, there is an option called “Run with highest privileges”. You will need to ensure this is selected and hit “OK”.

I have an FTP client open here, pointed at the same directory where I am supposed to upload my files. As you can see, there are no files at this moment. I have already created an identical task, configured with my own local backup folder and my own FTP server, so I am just going to go ahead and test that this is working.
For this purpose, I will select the task that I need to run right now, since it normally runs every single day at 8:00 AM. To prove this is working, I am just going to select it, right click, and click “Run”. When you do this, as you can see, a command prompt window opens with the file upload process taking place. When it is done, it simply quits the script. If you go to your FTP client and refresh the folder, you should see that the files have been uploaded. That is all for now. If you have any questions, please let me know, and have a good day.
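
For readers who would rather script this than dictate an ftp.exe batch file, here is a rough Python equivalent of the same DIY approach, using the standard library’s ftplib. The host, credentials and folder paths below are placeholders, and, as stressed above, this is shown for educational purposes only, not as a recommended way to protect your data.

    # A minimal Python sketch of the DIY FTP "backup" described above.
    # HOST, USER, PASSWORD, LOCAL_FOLDER and REMOTE_DIR are placeholders.
    import os
    from ftplib import FTP

    HOST = "ftp.example.com"
    USER = "your-username"
    PASSWORD = "your-password"
    LOCAL_FOLDER = r"C:\your-folder"
    REMOTE_DIR = "/your-directory"

    ftp = FTP(HOST)
    ftp.login(USER, PASSWORD)
    ftp.cwd(REMOTE_DIR)

    # Upload every file in the local folder in binary mode,
    # overwriting whatever is already on the server.
    for name in os.listdir(LOCAL_FOLDER):
        path = os.path.join(LOCAL_FOLDER, name)
        if os.path.isfile(path):
            with open(path, "rb") as f:
                ftp.storbinary("STOR " + name, f)

    ftp.quit()

Schedule it with Task Scheduler, just like the batch version, and it suffers from exactly the same problems discussed next.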

If this all sounds too good to be true, then it probably is. As mentioned at the beginning of this video, this is a horrible way to back up your data. There are many important reasons why, but it would take hours to list them all. So instead, I’d like to highlight a few of the most important problems.

1) You have no way of knowing whether the files were successfully transferred. We commonly see situations where companies have gone years without a successful backup, because they simply assumed their backups were working properly, without any monitoring or verification.

2) You’re only creating a single backup version. Every time you back up, you’re overwriting the latest remote copy with an identical copy of what’s on your hard drive. If a virus corrupts all of the files on your local machine, your backup data will be overwritten with this corrupted data. Now you have two sets of corrupted files and no valid backup copies. A proper backup solution should provide the ability to roll files back to a previous point in time.

3) This is an insecure solution. Not only is the data transmitted over an insecure protocol, but the remote backup copy is also kept unencrypted. Theft of poorly protected, unencrypted backup media is one of the leading causes of data leaks and security breaches.

4) With this solution, you’re only keeping a single copy of your backups. If your remote FTP server crashes, you’ll be unable to back up your data, and there’s no system for alerting you that your backups are no longer taking place. This is especially true for people who use certain varieties of low-end, consumer-grade Network Attached Storage devices, which are notorious for breaking down.

5) Depending on your upload speeds and the size of your data, it could take days or even weeks to transfer all of your files. And if your FTP server is kept at a remote location (as it should be), then you might end up incurring overage fees and penalties for exceeding your ISP’s transfer limits.

There are hundreds of other reasons why this DIY approach is a terrible way of backing up your data. But despite these warnings, this method — and other similar approaches — continues to be a commonly used way of automating off-site data protection. Yes, it’s cheap. Yes, it’s convenient. No, it won’t protect you. And for some people, two out of three is just fine for their needs. But if you’re serious about protecting your data through off-site backup, you should look for a serious, business-grade solution that ensures your data remains properly protected at all times. At a very high level, here are a few things you should look for:

  • Data is stored in secure facilities.
  • Backups are monitored for consistency, and technicians are there 24/7 to fix any problems.
  • Data is transferred securely over SSL, and remains encrypted while in remote storage.
  • Backups maintain many historical file versions, so that corrupted files can be rolled back to previous uncorrupted recovery points.
  • Only changed files are transferred, while files that have not changed since the previous backup cycle are skipped. This incremental approach ensures faster backups and more efficient bandwidth usage. (A brief sketch of this incremental, versioned approach follows this list.)
  • At least 2 backup copies are kept, so that you have backups of your backups.
  • Backups should be stored far away from the primary computers, ensuring that they are not destroyed in the event of theft or disaster.
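
To make the versioning and incremental points on this checklist more concrete, here is a minimal sketch, in Python with made-up folder paths, of what a versioned, incremental backup does differently from the DIY script above: it skips files that have not changed since the last run, and it saves each changed file as a new timestamped copy instead of overwriting the previous one. A real backup product does this at the block level, with encryption and monitoring; this is only an illustration.

    # Minimal illustration of incremental, versioned backup (not a real product).
    # SOURCE and BACKUP_ROOT are placeholder paths.
    import os
    import shutil
    import time

    SOURCE = r"C:\your-folder"
    BACKUP_ROOT = r"D:\backup-versions"

    def backup():
        os.makedirs(BACKUP_ROOT, exist_ok=True)
        stamp = time.strftime("%Y%m%d-%H%M%S")
        for name in os.listdir(SOURCE):
            src = os.path.join(SOURCE, name)
            if not os.path.isfile(src):
                continue
            # Find the newest existing version of this file, if any.
            versions = sorted(v for v in os.listdir(BACKUP_ROOT)
                              if v.startswith(name + "."))
            latest = os.path.join(BACKUP_ROOT, versions[-1]) if versions else None
            # Incremental: skip files that have not changed since the last run.
            if latest and os.path.getmtime(latest) >= os.path.getmtime(src):
                continue
            # Versioned: keep a new timestamped copy instead of overwriting.
            shutil.copy2(src, os.path.join(BACKUP_ROOT, name + "." + stamp))

    backup()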

But most importantly, you need a backup plan that’s custom-tailored to your needs. Talk to a backup professional, and implement a plan that is efficient and automated, and that leaves you truly and completely protected. A single critical data loss incident is all it would take to shut down a business forever. Never cut corners when it comes to the security of your priceless and irreplaceable digital assets.

Questions? Ask Our Experts!

Cloud Storage 101 Ebook – Cloud Storage for Beginners

Thanks to the rapidly-growing popularity of cloud storage, we’re constantly being approached with questions about the best ways to leverage the power of the cloud.

That’s why we’ve put together this Cloud Storage 101 ebook.

This guide is intended to give a high-level overview of the many different ways that cloud storage is currently being implemented, and how to make smart choices when implementing storage in the cloud. We’ve outlined some of the core fundamental concepts in a way that’s easily accessible to both technical and nontechnical readers.

Click here to download your copy of Cloud Storage 101

Questions? Ask Our Experts!

The Critical Feature that Many Offsite Backup Services Shockingly Leave Out

https://youtu.be/E2VrkDPLskA

 

It’s hard to think of anyone whose personal and professional lives have not been deeply affected by the emergence of cloud computing.

  • Most consumers prefer to communicate digitally through services like Gmail instead of written letters
  • Facebook, text messaging and Twitter are increasingly becoming preferred over “old fashioned” voice phone calls
  • Skype has virtually replaced long-distance phone calls
  • Netflix poses a real threat to cable broadcasting companies
  • Businesses which had once relied primarily on paper-based processes have now become almost completely paperless

 

Since our lives have become so incredibly dependent on the digital data we produce every day, backing up that data has become an absolute necessity. All of us know how devastating it can be to lose five years of memories after a crashed hard drive or stolen laptop. And we’ve also experienced the pain and embarrassment of losing a crucial USB drive on the day of a big presentation.

In the age of abundant and reliable Internet access, there is simply no excuse to lose digital information.

Older methods of backing up to external hard drives have fallen into obsolescence. Experience has shown that users have poor backup habits, and that physical storage is just as prone to loss or breakage as the computers they are designed to protect.

That’s why many users today rely on cloud backup services to protect their critical files and information. But picking the right backup solution can be difficult. There are literally thousands of offsite backup providers on the market, and each one has their own unique approach to data protection.

When purchasing offsite backup, most consumers are primarily concerned with preparing for the absolute worst scenarios. These include things such as muggings or major natural disasters. But it’s also important to consider the more common minor events which contribute to data loss.

By far, one of the most common ways data is destroyed is accidental deletion. It’s very easy to accidentally remove a file from your hard drive without realizing it, and it might be months before you notice what you’ve done.

Unfortunately, some low-end online backup services (especially the free or very low-cost options) are not designed to protect users against accidental deletion. When such a service believes you’ve intentionally removed a file from your hard drive, it will often delete the file from your backups within a few days. (This is usually done to cut down on storage costs.)

Online backup can be seen as insurance for your data, and it must have the features and functionality to protect you from — at the very least — the most common form of data loss.

A premium backup solution should never delete your data without your explicit permission. Although the act of backing up should be fully automated, the process for wiping the data should never be done automatically.

Retention of deleted files is a simple feature, but it’s an important one that could end up saving you lots of headaches in the future.

Questions? Ask Our Experts!

Storagepipe Launches New CDP Online Backup Service with Improved Speed, Efficiency and Customization

Storagepipe has announced the latest version of its popular CDP online backup software and industry-leading Service Provider Platform for white label partners. These solutions have been redesigned and enhanced to deliver optimized performance, cutting-edge innovation and the most in-demand features.

Storagepipe’s new CDP release offers a truly comprehensive cloud storage suite, including continuous and scheduled backup, server protection, synchronization & file sharing, and multi-device cloud storage — all through a single service. And these services are all backed by premium infrastructure, zealous support, and extensive encryption & security.

The new CDP is also designed to elegantly solve next-generation challenges that come from the increasing trends towards teleworking, mobile computing and fragmented office environments.

File Synchronization and Backup In One Integrated Solution

Storagepipe CDP provides automatic synchronization of local files to the cloud and across multiple devices (mobile, PC and Mac). Users can also upload files for offline storage without keeping a local copy, effectively expanding available capacity beyond the limits of local disks. By unifying online backup with synchronization, remote storage and file sharing, Storagepipe provides a truly complete cloud storage solution.

Simplified Management

Storagepipe CDP online backup runs as a service, allowing for fully automated laptop, desktop and server backups. Businesses can conveniently centralize the protection of their end-point systems and servers through a single service. IT administrators can manage, monitor, organize and control all of their users and user groups through a centralized management portal. Users can also access their files and manage their own accounts through a self-service portal.

Premium Infrastructure

Storagepipe CDP is built atop a premium backup architecture. Multi-threaded, encrypted, block-level backups run incrementally, making backups faster for any end-point device.

User data is client-side encrypted using strong AES-256 encryption, and transferred using a secure TLS connection for an added layer of security. Data remains encrypted while at rest on Storagepipe servers. For maximum resiliency, Storagepipe maintains multiple redundant copies of uploaded data, with the primary copy being kept on fast, resilient RAID disk architecture.
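
For readers curious what “client-side encryption” means in practice, here is a minimal sketch using the widely used Python cryptography library. This illustrates the general technique only; it is not Storagepipe’s implementation, the file name is a placeholder, and a real product also has to manage keys, nonces and metadata far more carefully.

    # Sketch of client-side AES-256 encryption before upload (illustration only).
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=256)   # 256-bit key held only by the client
    aesgcm = AESGCM(key)

    with open("report.xlsx", "rb") as f:        # placeholder file name
        plaintext = f.read()

    nonce = os.urandom(12)                      # unique nonce for each encryption
    ciphertext = aesgcm.encrypt(nonce, plaintext, None)

    # Only the nonce and ciphertext ever leave the machine; without the key,
    # neither the storage provider nor a thief can read the backup copy.
    with open("report.xlsx.enc", "wb") as f:
        f.write(nonce + ciphertext)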

Most In-Demand Features

The new Storagepipe CDP incorporates some of the most requested user features, which are currently lacking in the marketplace.

  • To protect against accidental deletion, Storagepipe CDP retains deleted files in backups indefinitely, until they are manually purged.
  • For faster backup speeds, the new Storagepipe CDP offers multi-threaded, encrypted block-level transfers.
  • There are no limits on the file types, locations or sizes that can be backed up.

 

Professional-Grade Support

All Storagepipe services are backed by live, North American phone support from highly trained technical staff, with up-to-date industry certifications.  24/7 emergency support is available for fast resolution of critical business server outages and other data disasters.

White Label Branding

White label resellers now have even more features, flexibility and functionality for both the back-end web portal and the end-point software.  This delivers a truly unique and personalized white label cloud storage experience when offering these premium backup & synchronization services to end-users.

About Storagepipe:

Storagepipe has been a leading full-service provider of off-site backup and disaster recovery services for over 14 years. Storagepipe’s portfolio of online backup, business continuity, archiving and disaster recovery services is available either directly or through Storagepipe’s broad network of white label and wholesale partners. For more information, visit http://storagepipe.com

Questions? Ask Our Experts!

Storagepipe Online Backup Opens New Texas Datacenter

In response to customer demand, Storagepipe has expanded its operations and opened a new datacenter in Dallas, TX.

This new location provides Storagepipe with a more distributed geographic footprint for companies with diverse and complex cloud requirements.
Storagepipe’s Dallas datacenter offers state-of-the-art security and compliance features to ensure that information protected in the cloud is treated with a level of security that is equal to or greater than what customers may be receiving through their on-premises, self-managed data backup and business continuity processes, but at a fraction of the cost and without any of the inconvenience.

Key security features include:

  • Onsite datacenter security guards 24 hours a day, 365 days a year
  • Video surveillance and recording of the exterior and interior of each facility
  • Biometric and key card security for rigid access control
  • Turnstile doors to prevent tailgating
  • Reinforced physical structure, including concrete bollards, steel-lined walls, bulletproof glass and perimeter fencing
  • Dedicated data halls, suites and cages for customized solutions
  • Facilities in compliance with FISMA, HIPAA and PCI DSS

Benefits for Oil & Gas and Energy

The Oil & Gas and energy industries have undergone a dramatic transformation within recent years. Organizations in this market are becoming increasingly reliant on data, and the leaders in this space are those who are able to extract business value through the collection, analysis and sharing of information.
Storagepipe’s new datacenter is located within — and connected directly to — one of the most important data interconnects for the Oil & Gas and Energy industries. This makes Storagepipe an ideal, cost-effective and convenient option for protecting data for companies in these industries.
And with many industry customers already using Storagepipe for years, the company has significant experience in addressing the unique data protection requirements of this sector.

Benefits for Resellers and Partners

White-Label and Wholesale Partners — such as Managed Services Providers, VARs and telecom providers — were also consulted during the conception of this new datacenter, as they saw value in having a site with direct connectivity to a major communications hub. This facilitates rapid, low-latency data transfers and provides a competitive advantage in attracting SMB and Enterprise customers.

According to Storagepipe CEO Steven Rodin:

“This new datacenter launch represents an important step in our company’s evolution. We aim to provide the broadest range of services, backed by professional and attentive support and we also recognize that certain markets and industries have unmet requirements which require specialization from cloud services.”

More About Storagepipe:

Storagepipe has been a leading full-service provider of off-site backup and disaster recovery services since 2001. Storagepipe’s portfolio of online backup, business continuity, archiving and disaster recovery services is available either directly or through Storagepipe’s broad network of white label and wholesale partners.

Questions? Ask Our Experts!

How to Test Backups

Will your backup work when you need it the most? Is your backup approach seamless or disjointed? Do you test your backup methods regularly?

Testing your backups consists of more than simply recovering the previous day’s files or recovering a server to another location. If you aren’t testing your backups at least once a year, your data isn’t protected.

Things change over time. Your business will acquire new servers, launch new software, implement new processes and experience storage growth. Thus, a seamless backup process with periodic testing of your backup implementation is critical.

Maintaining a disaster plan is important. Your disaster plan should address:

  • How much data do you need to maintain?
  • How quickly do you need your files or systems back online?
  • How much critical business data can you reasonably afford to lose in the event of a major disaster?
  • When restoring old data, will you encounter compatibility problems?
  • In addition to data, do you also need to protect systems or hardware?
  • Do you have a succession plan for key IT personnel?
  • What is plan B in the event that you can’t bring systems back online quickly?
  • How much will it cost to restore systems and data back to their pre-disaster state?

Unfortunately, it’s all too common for companies that are confident in the reliability of their disaster recovery plans to suffer critical data loss, because no one ever verified that the backups could actually be recovered.

The best way to test backups is by simulating real-world data loss scenarios, and recovering as if those scenarios were actually happening in real life. These disaster recovery drills should also be scheduled at random dates and times, so that you’re forced to constantly be ready for an unexpected drill.

Keeping your backups uniform, seamless and simple will make backup and recovery easier, and will allow you to adapt to change and growth as it occurs. The best solution is a single service provider that can back up your databases, email servers, flat files and PCs while also providing e-discovery, archiving, backup testing, mirroring and bare metal recovery.

For more information on testing your backups, watch this short video below.

Questions? Ask Our Experts!

How DRaaS (Disaster Recovery As A Service) Helps Eliminate Unplanned Downtime

Today, business no longer operates on a 9-to-5 schedule. Now, customers expect access to their accounts and resources around the clock. And employees want access to internal systems from anywhere in the world, at any time of day.

In a 9-to-5 business world, companies could afford to shut down for a day or two after a server crash while systems were brought back online. But today, unplanned downtime comes with much more serious consequences and should be avoided at all costs.

Of course, it’s impossible to prevent every system outage. Software bugs, viruses, hardware problems, human error, and even natural disasters can bring a company’s IT systems to a sudden halt. And there isn’t very much that IT administrators can do to provide 100% protection against these unexpected events.

But there are some efficient and inexpensive ways to avert the costly consequences of unplanned downtime. With a cloud-based disaster recovery solution, companies have a cost-effective way to ensure that critical systems can be brought back online in a timely manner, with little or no noticeable downtime.

Disaster-Recovery-as-a-Service, or DRaaS, allows small businesses with limited IT resources to access the same kinds of business continuity measures that are in use at major Fortune 500 companies, but without the significant investments in hardware, licensing and staffing.

DRaaS helps prevent unplanned downtime using several different approaches.

One common method relies on the use of a local backup appliance which connects to a remote cloud datacenter. Backups are performed on a frequent basis to the local backup appliance, and then these changes are transferred to the cloud datacenter.

In the event of a disaster, backups can be quickly transferred over to the main server from this local appliance. If the main server is physically destroyed or otherwise incapacitated, a temporary recovery server can quickly be mounted on the local appliance using the most appropriate backup copy. And in events — such as natural disasters — where both the primary server and the DR appliance are unavailable, then systems can be spun up in the cloud as a temporary center of operations until the primary servers can be rebuilt.

For optimal business continuity, primary servers can also be mirrored to the cloud and continuously monitored. In the event that primary systems go down, operations can quickly “fail over” to the cloud provider’s datacenter until these servers can be brought back.
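
As a toy illustration of the “monitor and fail over” idea described above, the sketch below polls a health-check URL and triggers a failover action after several consecutive failures. The URL and the fail_over_to_cloud() function are hypothetical placeholders; a real DRaaS platform does this with replication agents and orchestration tooling rather than a simple loop.

    # Toy illustration of continuous monitoring with failover (not a real DRaaS agent).
    # PRIMARY_URL and fail_over_to_cloud() are placeholders.
    import time
    import urllib.request

    PRIMARY_URL = "https://primary.example.com/health"
    FAILURES_BEFORE_FAILOVER = 3

    def primary_is_healthy():
        try:
            with urllib.request.urlopen(PRIMARY_URL, timeout=5) as resp:
                return resp.status == 200
        except Exception:
            return False

    def fail_over_to_cloud():
        # In a real DRaaS setup this would spin up replica servers in the
        # provider's datacenter and redirect traffic to them.
        print("Primary is down: failing over to the cloud replica.")

    failures = 0
    while True:
        if primary_is_healthy():
            failures = 0
        else:
            failures += 1
            if failures >= FAILURES_BEFORE_FAILOVER:
                fail_over_to_cloud()
                break
        time.sleep(30)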

For more information on Disaster Recovery as a Service, we’ve provided this short video.

Questions? Ask Our Experts!

Email Archiving to Overcome Microsoft Exchange Performance Issues

For smaller organizations that manage their own systems, email server platforms — such as Microsoft Exchange — can present some special challenges.

Long-term retention of emails is important for a number of reasons.

  • First, many employees rely on their email accounts as a kind of “database” of their past activities. They need the ability to go back and search through historical conversations in order to locate information that’s immediately useful.
  • But more importantly, long-term email retention is critically important for legal compliance reasons.

However, retaining email long-term is a challenge, since email data is growing exponentially. And not much can be done to stop this growth, since email is the core communications medium for most business transactions today.

As email stores grow and disks fill up, server speed and performance begin to degrade. And although data continues to grow exponentially, hardware budgets are not growing along the same curve.

One elegant solution is to use a “tiered storage” approach, with the help of an email archiving system.

With email archiving, older email data is periodically removed from expensive live production server disks, and copied over to an archival storage facility for long-term preservation. Since older or inactive data is rarely used, this allows system resources to be prioritized for more recent email data which is more frequently accessed.

With cloud-based email archiving, employees can still access their older, archived emails even though they are stored in the cloud. For the end user, there is no noticeable difference. But for IT administrators, this can provide a significant boost in email server performance in a way that’s very cost-effective.
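
To make the tiered-storage idea concrete, here is a deliberately simplified sketch that moves messages untouched for a year off the live store and into an archive location. The paths and cutoff are placeholders, and a real archiving product works through Exchange’s own APIs and journaling rather than loose files on disk.

    # Simplified illustration of age-based email archiving (tiered storage).
    # LIVE_STORE, ARCHIVE_STORE and CUTOFF_DAYS are placeholders.
    import os
    import shutil
    import time

    LIVE_STORE = "/mail/live"
    ARCHIVE_STORE = "/mail/archive"
    CUTOFF_DAYS = 365

    cutoff = time.time() - CUTOFF_DAYS * 24 * 60 * 60
    os.makedirs(ARCHIVE_STORE, exist_ok=True)

    for name in os.listdir(LIVE_STORE):
        path = os.path.join(LIVE_STORE, name)
        # Move messages that have not been touched in a year off the
        # expensive production disks and into long-term archive storage.
        if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
            shutil.move(path, os.path.join(ARCHIVE_STORE, name))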

To learn more about how email archiving can help optimize performance of Microsoft Exchange servers, we’ve included this short video.

Questions? Ask Our Experts!