
Nov 14, 2017

Choosing a Public Cloud: Avoiding Noisy Neighbors

Paul Painter, Director, Solutions Engineering

I frequently hear people asking how to choose from among so many potential cloud options. The answer, I’m afraid, is “It depends.” Some cloud providers may over-provision their resources, causing your traffic to slow down. This drop in service is often called the “noisy neighbor” syndrome, but how can you avoid this?

Location, Location, Location

Just like choosing a house, it’s important to get a comfort level with the neighborhood. To evaluate cloud provider environments, you need to ask a few questions:

1.) What is the underlying hardware in the cloud environment?

For performance, you first want to confirm that the processor offers horsepower equal to or greater than your computing needs. It is also important to know whether hyper-threading is enabled on the processor. For example, our AgileCLOUD is built using Intel Xeon E5-2650 v3 processors, which have 10 cores running at up to 3.0 GHz each; with hyper-threading enabled, each core presents two hardware threads.

2.) What is the vCPU ratio?

In a virtual cloud environment, the hypervisor divides the physical CPU cores into smaller virtual CPUs (vCPUs). Many providers oversubscribe the available vCPUs; in other words, they assign more vCPUs than are physically available, on the assumption that most virtual servers will not need all of their assigned CPU cycles at once. Cloud environments that oversubscribe are often cheaper and are ideal for workloads that are idle most of the time.

When a guest operating system is installed on the hypervisor, the guest instance is assigned a pool of vCPUs, as well as virtual RAM (vRAM) and disk storage. The hypervisor schedules each vCPU's access to the physical CPU in a round-robin manner, which creates vCPU queues.
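To make the ratio concrete, here is a minimal sketch in Python of how oversubscription is calculated. The figures are illustrative, not any specific provider's numbers:

```python
# Sketch: estimating a provider's vCPU oversubscription ratio.

def oversubscription_ratio(physical_cores: int, threads_per_core: int,
                           vcpus_assigned: int) -> float:
    """Ratio of vCPUs sold to hardware threads actually available."""
    hardware_threads = physical_cores * threads_per_core
    return vcpus_assigned / hardware_threads

# A host with 10 hyper-threaded cores exposes 20 hardware threads.
# Selling 60 vCPUs against it is a 3:1 oversubscription ratio.
print(oversubscription_ratio(10, 2, 60))  # 3.0
```

At 3:1, a fully busy host can give each vCPU at most a third of a hardware thread, which is exactly where noisy-neighbor slowdowns come from.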

3.) What are the underlying disk configuration and network speeds?

The local disk type (SSD or HDD), the number of disks and the RAID configuration will all affect the performance of an I/O-intensive application. AgileCLOUD uses SSDs in RAID 10, providing the fastest local disk performance possible.
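As a rough illustration of why the RAID level matters, the sketch below models first-order RAID 10 throughput. The per-drive figure is an assumption for illustration, not a benchmark of any particular hardware:

```python
# Rough first-order model of RAID 10 throughput. Real performance also
# depends on the controller, queue depth, and workload mix.

def raid10_throughput(drives: int, per_drive_mb_s: float):
    """RAID 10 stripes across mirrored pairs: reads can be served from every
    drive, while writes must land on both halves of each mirror, so write
    throughput is roughly halved."""
    assert drives % 2 == 0 and drives >= 4, "RAID 10 needs an even drive count (>= 4)"
    read_mb_s = drives * per_drive_mb_s
    write_mb_s = (drives // 2) * per_drive_mb_s
    return read_mb_s, write_mb_s

# Four SSDs at ~500 MB/s each: ~2000 MB/s reads, ~1000 MB/s writes.
print(raid10_throughput(4, 500.0))
```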

Hypervisors and the Noisy Neighbor Syndrome

Just like large lots make for quiet neighborhoods, a low physical-to-virtual ratio makes for good computing. A cloud with a low ratio gives each instance more access to the physical hardware, the way a house with a large yard keeps a noisy neighbor's music from disturbing your dinner. A higher ratio means less access to the actual physical hardware, more like a townhouse where the neighbor's rock band practices next door.

Evaluating a cloud provider's performance is more complicated than comparing price per vCPU:

  • Compare processor speeds: some cloud providers run multiple hardware generations with different clock speeds (and core counts).
  • What is the oversubscription ratio?
  • Available disk I/O is critical to server performance: both the read/write capacity of the drives and, if you attach block storage, the network path to it.
  • Finally, understand the physical network capacity of the server for both LAN/WAN traffic and any storage access.
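Putting the first two points together, a quick back-of-the-envelope comparison might look like the sketch below. The offers and prices are hypothetical:

```python
# Sketch: normalizing "price per vCPU" by clock speed and oversubscription.
# Both offers here are invented for illustration.

def effective_ghz_per_dollar(clock_ghz: float, oversub_ratio: float,
                             price_per_vcpu: float) -> float:
    """Worst case, an oversubscribed vCPU shares its hardware thread with
    (ratio - 1) other vCPUs, so divide the clock by the ratio."""
    worst_case_ghz = clock_ghz / oversub_ratio
    return worst_case_ghz / price_per_vcpu

cheap = effective_ghz_per_dollar(clock_ghz=2.4, oversub_ratio=4.0, price_per_vcpu=5.0)
premium = effective_ghz_per_dollar(clock_ghz=3.0, oversub_ratio=1.0, price_per_vcpu=15.0)
print(cheap, premium)  # the "cheaper" vCPU delivers less worst-case compute per dollar
```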

INAP Provides You Options

Our AgileCLOUD, for example, comes in two flavors, and, depending on your workload, one may be better than the other. Our Series A is specifically suited for web, application and light I/O workloads. Our Series B is better suited for applications that have higher CPU and memory demands.

Which of these two options is better for you depends on your workload, and the answer may even be both! Let me break down the differences for you. The obvious differentiator between the two options is the hypervisor configuration in the AgileCLOUD lineup.

To keep it simple, our AgileCLOUD offers two series of cloud compute (see table below).

Our A Series, good for small databases, websites and content management systems that require moderate CPU utilization, has a 3:1 vCPU-to-CPU ratio.

Our B Series, better for medium databases, complex websites and scheduled batch-processing tasks requiring heavy memory and CPU utilization, has a one-to-one (1:1) ratio.

In other words, we don't oversubscribe the B Series, so there is no chance of CPU contention and little chance of that noisy neighbor.

Armed with this knowledge, we believe that you will be in a better position to understand the vast array of available cloud options. If you are interested in learning more about cloud solutions that can fit into your unique cloud strategy, contact us today to speak with one of our cloud professionals or deploy your instances right away in our cloud portal.

Explore HorizonIQ
Bare Metal


About Author

Paul Painter

Director, Solutions Engineering

Read More
Oct 5, 2017

AgileMigration Part 1


Discovering the Monster

If you ever have to manage an IT system migration, you will have plenty of potential conflicts. Based on what I have experienced in over a decade in the field, you may encounter something like this:

You accept a job at a great company managing a pretty sizeable IT environment. You’ve been made aware that stability and cost control will be your primary concerns. This includes moving the majority, if not all, of your infrastructure to “The Cloud” in the next 6 months!

You arrive enthusiastic, but, after a major outage in your first week, you try to triage with a root cause analysis. Each department in IT can only give an account of a few specific systems, so you ask for access to monitoring systems and to be added as a recipient of all critical alerts to get an overhead view.

That’s when you find out that such a system doesn’t exist in the department. You ask why, and one of your employees explains, “Johnny DBA doesn’t want us to monitor his program because he says he owns it.” Suddenly you realize why stability and cost control are your navigational goals: territorial thinking and departmental silos have led to a dysfunctional organization.

You ask your team for an inventory of all the systems, and over the next few weeks you dig further, uncovering more and more gaps in the inventory and even a feeble attempt at a coup, which results in the termination of one of the most tenured system admins. With his termination, knowledge of the oldest systems still clinging to the infrastructure leaves with him.

It’s at about this point that you realize you’ve inherited a monster:

  • Outdated servers running critical systems
  • Questionable backup processes
  • A staff with entrenched territorial thinking

I call this the Frankenstein of IT, and I learned early on in my career that only a formal inventory could protect my IT environment from the monster.

Taming the Monster

Introducing INAP’s AgileMigration Service

As the leader of your IT organization, you depend on accurate and detailed reports to succeed. These include:

  • The physical infrastructure: what hardware is deployed and how old it is
  • The application stack, including versions, service packs and patches
  • Resource utilization: what is assigned to the host or guest vs. how much is really needed

To migrate any environment, then, you want to call out gaps and issues as quickly as possible so that you can set realistic expectations. Performing a data audit will help you determine what budget is likely needed, whom you will need to manage, and how to migrate the environment. You need to understand not only the infrastructure being moved, but also the inter-application dependencies and affinities. In other words, you need to know how each system interacts with every other system for any given report or service.
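One way to make dependencies and affinities actionable is to model them as a graph and order the migration so that no system moves before the systems it depends on. The inventory below is hypothetical, purely for illustration:

```python
# Sketch: ordering a migration from an inter-application dependency map.
# Assumes the dependency graph is acyclic (no circular dependencies).

depends_on = {
    "web-frontend": ["app-server"],
    "app-server": ["db-primary", "cache"],
    "report-batch": ["db-primary"],
    "db-primary": [],
    "cache": [],
}

def migration_order(deps):
    """Topological sort: migrate a system only after everything it depends on."""
    order, seen = [], set()
    def visit(node):
        if node in seen:
            return
        seen.add(node)
        for dep in deps.get(node, []):
            visit(dep)
        order.append(node)
    for node in deps:
        visit(node)
    return order

print(migration_order(depends_on))
```

Systems that appear early in the list (here, the database and cache) are the ones every later wave depends on, which is exactly the affinity information a discovery phase needs to surface.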

In talking with our customers, we found that they struggle to get this information in a timely manner. This was the primary driver for launching AgileMigration, a comprehensive white glove migration service.

The AgileMigration solution consists of three distinct phases: Map, Manage, and Migrate. In the Map phase, our noninvasive technology collects a complete inventory of the environment, including the application workloads and dependent systems across the network. In the Manage phase, detailed infrastructure inventory reports compiled from that data create a clear plan to migrate. Finally, our technology will Migrate your entire environment from your existing platform or cloud provider to new environments with minimal or no downtime.



Discovery is the First Step

Whether you are looking to use INAP services or you have no intention to move but you need assistance auditing your environment, we can help. The AgileMigration service is made up of discovery tools, migration tools and professional services, each with a unique role to play.

For discovery, we will provide you with a physical or virtual appliance to set up in your environment to collect the details you need. We typically like to run the collection for a period of no less than two weeks, but we recommend at least a month to capture any end-of-month activity.

Our Agentless Deployment provides you with the following benefits:

  • No software prerequisites or server reboots
  • Lightweight, quick implementation
  • No port scanning or packet interrogation
  • Affinity mapping
  • Discovers all equipment (servers, networking, security appliances and storage appliances)

To get the most from your discovery, Customized Detailed Reporting allows you to configure the information you need in exportable spreadsheets.

Banishing the Monster

Our AgileMigration service will lower your project costs by reducing personnel hours associated with manual discovery efforts and eliminating challenges associated with subjective data. More importantly, you will be able to keep skilled IT staff on projects that add value to your organization’s mission. Finally, our discovery provides a “source of truth”: using hard data for planning will help break down silos.

Once you have completed Discovery, you have several options for how to use the information. In the next blog post, I will detail how INAP can help your company through the Mapping and Migration phases.

Nov 26, 2013

Internap’s acquisition of iWeb is officially closed!

Ansley Kilgore

With the official business of the acquisition behind us, Internap and iWeb are ready to embark on our journey together as one of the top pure-play IT infrastructure services providers. We are proud to join forces with our new iWeb team members from around the world, and can’t wait to see what our combined army of hosting and cloud experts can achieve.

With our shared focus on meeting customer needs at every stage of the business life cycle, along with our shared passion for OpenStack, our consolidated team is certainly far greater than the sum of its parts.

Since day one, though, Internap’s relationship with iWeb has been about more than just financial statements and strategic business goals. Every interaction between our teams has confirmed that our ideas, philosophies, and personal synergies are perfectly harmonized with each other, and we couldn’t be happier with the combined team that we have created.

We’d like to extend a heartfelt ‘thank you’ to Christian and the entire iWeb team for your willingness to “pass the puck” to us. We look forward to learning from each other and writing the next chapter of Internap and iWeb history together.

Who knows, we may even learn a thing or two about hockey.

Sep 27, 2013

Webinar recap: The nuts and bolts of bare-metal clouds

Ansley Kilgore

Bare-metal cloud is emerging as the next evolution of cloud computing. More workloads and use cases are demanding high-performance processing power, and in many cases these requirements are best served by dedicated, physical infrastructure. Bare-metal cloud offers a cost-effective way to get full server performance with no virtualization penalty.

Recently, Internap and GigaOM Research discussed the nuts and bolts of bare-metal clouds (watch the webinar here). As enterprises continue to embrace the cloud, bare metal is quickly becoming an economical way to meet high performance demands, making it an essential piece in the future of cloud solutions.

Price for performance – The bare-metal cloud’s usage-based pricing model and automation capabilities offer upfront and long-term savings, along with substantial performance gains when compared to traditional public clouds. For workloads or databases with a low tolerance for the noisy neighbors and oversubscription found in multi-tenant environments, the bare-metal alternative gives you the full compute and processing power of a dedicated server. Bare-metal cloud is a natural choice for performance-sensitive use cases such as big data and media encoding.

Hybridization – For organizations just starting out, the ability to use bare-metal cloud together with virtualized resources is a key driver. This approach lets you get the most from your existing infrastructure while seamlessly adding new performance capabilities. Connecting your cloud environment with other services such as colocation offers a higher degree of control and flexibility that physical infrastructure alone cannot provide. Hybridization also allows you to use the cloud for bursting needs, which is a model used by many Internap customers.

Innovation – Moving forward, IT departments will need the scalability, storage and security advantages of bare-metal cloud in order to meet requirements from the lines of business. As more cloud providers start offering bare metal capabilities to meet this demand, customers should expect further innovations around automation and flexibility. Within the next few years, bare metal will become mainstream and concerns about data control and compliance will no longer justify buying your own hardware. Many companies have already incorporated bare metal into their cloud strategy.

As more enterprises move workloads into the cloud, bare metal should be considered part of a successful IT Infrastructure strategy. Is your current cloud service meeting your requirements from a performance, security and cost standpoint?

Learn more about the Nuts and Bolts of Bare-Metal Clouds.

Jul 2, 2013

Three keys to success for online game developers

Ansley Kilgore

For game developers and publishers, launching a new game into the market and creating loyal fans takes a lot of work. Whether you are an established gaming company or a new publisher entering the highly-competitive marketplace, having the right IT infrastructure in place is critical. Your game must deliver the availability, performance and scale that online gamers have come to expect on their digital quests. Internap solutions help set game developers up for success by providing the right environment for testing, development and deployment of online games.

High-performance cloud services
When the Massachusetts Digital Games Institute (MassDigi) needed a server to host their new online game, Internap’s AgileCLOUD provided a cost-effective way to spin up virtual servers that could support their development and collaboration needs. MassDigi was able to continuously test their online game with live users, and scale dynamically up to thousands of players as needed. With cloud hosting solutions, game developers have more flexibility to test, develop and deploy without worrying about the limitations of technology.

When introducing new games into the market, speed and performance are key aspects of high-quality game delivery. No matter how awesome your game may be, users will abandon it if there is too much latency. With route-optimized Performance IP™, Infinite Game Publishing (IGP) can provide gamers with a flawless online experience during the initial game launch and beyond. Meeting the expectations of online gamers is the first step to creating a loyal customer base. This is especially important in the free-to-play revenue model, which is less predictable than a subscription-based model.

Hybrid infrastructure
Gaming companies that use Internap services have the ability to mix and match different infrastructure offerings, including public and private cloud, bare-metal, managed hosting and colocation. As the gaming industry continues to grow and more businesses enter the market, the successful gaming publishers will be those who can seamlessly deliver their game to end users with low latency, high availability and high performance. The ability to establish and maintain your competitive edge depends on having the right gaming infrastructure in place.

Jun 18, 2013

Industry news: cloud and colocation services offer data protection and security

Ansley Kilgore

Whether you’re trying to protect your data from a natural disaster, or concerned about meeting compliance or regulatory requirements for data storage, colocation and cloud services can provide peace of mind for your business. While there are many reasons to take advantage of colocation services, improved security and data protection can be some of the most important.

Below is a collection of articles to provide insight into the security capabilities of colocation and cloud services.

NIST releases cloud security documentation
The National Institute of Standards and Technology recently released a new standards document designed to accelerate cloud adoption in government settings. The guidelines are focused on helping public sector organizations establish cloud computing use models that are secure enough to meet stringent government requirements.

Colocation hosting offers value as a data protection strategy
Colocation providers offer remote storage, network security, firewalls and physical protections for your data. Access to your servers can be better controlled in a secure data center than in most office buildings. Colocation services can also help control access to your internal networks, making sure that only those who are authorized can access confidential data and company information.

Active Hurricane season predicted — colocation can be an asset
Colocation providers that offer complete infrastructure redundancy help minimize the risk of data loss in the event of a disaster. Data centers with reliable N+1 design and concurrent maintainability, such as Internap’s New York Metro data center, can protect against outages. When evaluating data center providers, make sure their facilities have the right infrastructure design and preventative maintenance in place so that your data and equipment are protected if disaster strikes. Since natural disasters are unpredictable, no data center can guarantee that your servers won’t go down, but data center facilities and colocation services can be a key part of your disaster recovery strategy.

Keep security in mind when choosing a cloud provider
The public cloud created security concerns initially, because multiple organizations were sharing resources from the same cluster of servers. If one company experienced a breach within its virtual machine, it was possible that other organizations sharing the same resources could also be affected. However, with dedicated private cloud options and secure networks, cloud providers can successfully protect data and avert threats.

For up-to-date information on IT industry news and trends, check out Internap’s Industry News section.

Apr 4, 2013

Back to GDC: Trends in the online gaming industry


We were back at the Game Developers Conference in San Francisco last week! It was an interesting and educational trip; GDC is a great opportunity to experience the many facets of the gaming world, from developers and publishers with big household names to indie developers and the different tools that enable them to enter the market. Throughout the sessions and while walking and talking on the expo floor, there was a lot to take in, including big reveals like Battlefield 4 and the impressive Unreal Engine 4 demo.

I noticed a few big takeaways:

CDN is still king
In our current era of patching and DLC (downloadable content), a reliable and efficient content delivery network (CDN) has become a necessity, given the frequency and large file sizes of patches. Since delivery via a CDN is at least 5x faster than uncached origin delivery, this is a no-brainer for the gaming industry. A newly released patch that takes too long to download, or that locks a significant portion of your players out because they don’t have the latest version, invites significant backlash.

This was very evident after attending Arenanet’s session about their successful MMO Guild Wars 2 (MMOs are notorious for their reliance on patching both to apply critical fixes and to add new content). Arenanet needed a custom CDN solution (something our CDN ops team regularly provides to gaming customers) to ease their patching woes and enable players to obtain the latest patch from either Arenanet or the CDN as soon as the patch went live.

Time for indie?
Even though the rise of the indie developer has been a long time coming, this year’s GDC heavily showcased a number of new SDKs (software development kits) specifically tailored to lower the barrier to entry for them. These tools focused mostly on HTML5 and JavaScript, allowing developers to easily port their web-ready content or games to new platforms.

Also interesting is that this new direction is apparent on consoles, too. Nintendo showcased its Web Development kit, which allows developers to transfer content to the Wii U; that isn’t surprising given rising online-store sales on each of the three consoles. Another development in this area was the strong showing of the Unity game engine, given its indie support community and lower cost (free for the basic version). What does this all mean? There’s already an increased demand for hosting services across the industry, and these tools help not only to continue but to increase that trend.

What about cloud?
There was a lot of talk about cloud as well, from recent offerings to new improvements and updates from cloud providers. One recurring theme, however, was that developers were looking for smoother and easier ways to spin up cloud instances globally depending on peak game times. We agree that automation is key for the gaming vertical – rapid game development cycles and notoriously fickle players demand infrastructure agility. Our Hosting API allows developers to spin up instances as peak times occur throughout the day and through different geographical regions, and our monitoring tools allow developers to keep track of when and where new instances should be spun up.
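The scaling decision itself can be sketched in a few lines. The player counts and per-instance capacity below are illustrative assumptions, and the actual instance-creation calls are omitted because they are specific to each provider's documented API:

```python
# Sketch: deciding how many game-server instances a region needs at peak.

def desired_instances(current_players: int, players_per_instance: int,
                      minimum: int = 1) -> int:
    """Scale the instance count to player load, never dropping below a
    warm minimum so the region is ready when players arrive."""
    needed = -(-current_players // players_per_instance)  # ceiling division
    return max(minimum, needed)

# Peak in one region, quiet in another: scale each independently.
print(desired_instances(9500, 1000))  # 10
print(desired_instances(120, 1000))   # 1
```

A monitoring loop would call this per region and reconcile the result against the instances currently running, spinning servers up or down through the provider's API.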

See you next time at GDC! In the meantime, check out our online gaming success kit.

Feb 20, 2013

Choosing the right provider can alleviate cloud security concerns

Ansley Kilgore

Cloud hosting solutions and other cloud offerings provide organizations with a powerful, cost-effective way to improve operations. However, cloud computing is a relatively young technology that many businesses are still trying to figure out. This has led to many security concerns because issues of control, data ownership and availability come into play. While these concerns are real, all of them can be addressed by choosing the right service provider to partner with.

Understanding cloud security issues
The core security issue in cloud environments stems from control. The data is placed in the hands of a third-party vendor, putting organizations in a position where they have to iron out the details of who owns the data at various phases of the partnership. Boundaries are unclear across the industry, creating problems with control and availability if the partnership is not effective.

The primary issue at the center of this is the newness of cloud technology. To a great extent, companies are still searching for the line between their responsibilities and those of the cloud provider. Inconsistent standards across the industry have led to many of the cloud security concerns, but individual providers can overcome these problems and protect data in the cloud more effectively than most businesses can do on their own.

Improving cloud security through an effective partnership
Collaborating with a cloud vendor is the first thing businesses need to do if they want to secure cloud assets effectively. Understanding everything from how the virtual servers will be established to how data flows through the network can give IT departments a better idea of how the system will work and where the security responsibility falls on them. This communication can also allow both parties to establish clear lines of demarcation identifying data ownership, availability and control standards within the cloud deal.

Developing this type of collaborative partnership can be difficult. Working with a provider that is flexible and capable of offering a powerful security foundation is key when striving for such a relationship. If the vendor has a good security program and a clear Service Level Agreement (SLA), both parties will understand which responsibilities fall to the client and which ones stay with the provider. This knowledge can act as the starting point in building the type of working environment that overcomes many cloud security issues.

Recently, 54% of companies identified security as their main concern for transitioning to the cloud. Check out our infographic, Cloud Security: Perception vs. Reality, to learn how the right provider can alleviate your concerns and provide a secure environment for your data.
