When I say POODLE, what do you think of? A fluffy dog? In most cases I would be referring to the fluffy dog, but in this article we will be focusing on a security vulnerability. If you're currently using SSL version 3.0, you will need to update the SSL daemon on your server. SSL stands for Secure Sockets Layer, and an SSL certificate is something every ecommerce site should have: it allows you to securely process payments through your website. In fact, if you're taking orders from your clients at all, you should be using SSL. It adds another layer of security and trust for your clients. If you're running an ecommerce site and haven't read my post on PCI compliance, you should read it here: (Insert link to PCI compliance post)
As with any piece of software on the internet, SSL comes in different versions. SSL version 3.0 is nearly 18 years old and no longer secure, yet it remains in widespread use across the internet. Nearly all browsers support SSL version 3, and in order to work around bugs in HTTPS servers, browsers will retry failed connections with older protocol versions, including SSL 3.0. This retrying of failed connections is what allows the POODLE exploit to be initiated, and a successful attack can leak your customers' data when they place orders. You can read more about the specifics of the attack here:
Browsers and websites should turn off SSLv3 in order to avoid compromising users' private data. The most straightforward method is to disable SSL 3.0 entirely, which you can see how to do at the links below; however, this can cause a myriad of compatibility issues. The recommended option, therefore, is to enable TLS_FALLBACK_SCSV. The links below show you how to properly secure your server's SSL daemon. These options resolve the issue of retrying failed SSL connections and prevent knowledgeable attackers from downgrading a connection from TLS 1.2 to 1.1 or 1.0.
For WHM/cPanel servers – https://documentation.cpanel.net/display/CKB/How+to+Adjust+Cipher+Protocols
For DirectAdmin servers – http://forum.directadmin.com/showthread.php?t=50105
For Plesk servers – http://kb.sp.parallels.com/en/123160
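If you manage the web daemon yourself rather than through a control panel, disabling SSLv3 typically comes down to a single configuration line. This is a sketch only; file locations vary by distribution, so check your own setup:

```shell
# Apache: in your SSL virtual host (e.g. /etc/httpd/conf.d/ssl.conf), set
#   SSLProtocol all -SSLv2 -SSLv3
# then reload the daemon: service httpd restart (apache2 on Debian/Ubuntu)

# nginx: in the server block that serves HTTPS, set
#   ssl_protocols TLSv1 TLSv1.1 TLSv1.2;
# then reload the daemon: service nginx reload
```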
‘In computing, a firewall is a network security system that controls the incoming and outgoing network traffic based on an applied rule set. A firewall establishes a barrier between a trusted, secure internal network and another network (e.g., the Internet) that is assumed not to be secure and trusted.’
As avid readers of the blog know, I like to ground these ideas with everyday analogies. You can think of a firewall like the door to your home. When the door is open, people can walk directly into your house; should you want to keep people out, you close and lock the door. This is the way a firewall works on a server: you place the firewall onto your server to keep intruders on the internet from accessing your data.
Firewalls can be either hardware or software based. With a hardware-based firewall, the firewall is connected to your switch and filters traffic based on a rule set you determine; you would typically use one in front of a dedicated server. A software-based firewall is installed within your server. It still blocks traffic based on rule sets you create, but it does so from within the server rather than out in front like a hardware-based firewall.
For the rest of this article, I will provide you the steps to install CSF, which is short for ConfigServer Security and Firewall. This firewall is supported across many different operating systems and platforms: Red Hat Enterprise Linux, CentOS, CloudLinux, Fedora, Virtuozzo, and VMware, to name a few. You can read more about the supported systems here: http://configserver.com/cp/csf.html
This firewall can be installed with the following steps on your Linux based server:
mkdir /usr/local/src <-- Creates the directory to install CSF
cd /usr/local/src <-- Changes your location on the server to the newly created directory
wget http://www.configserver.com/free/csf.tgz <-- Downloads the CSF software to your server
tar xfz csf.tgz <-- Extracts the software
cd csf <-- Changes your location on the server to the CSF directory
./install.sh <-- Installs the CSF firewall
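Once the installer finishes, it's worth confirming that the server has everything CSF needs. CSF ships with a test script for this; the paths below are CSF's defaults:

```shell
perl /usr/local/csf/bin/csftest.pl   # ships with CSF; checks for the required iptables modules
csf -v                               # prints the installed CSF version
# Once you are happy with your rules, set TESTING = "0" in /etc/csf/csf.conf
# and restart the firewall with: csf -r
```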
When installed and configured properly, CSF places a preset list of rules onto your server. These rules can be configured directly within /etc/csf/csf.conf, the CSF configuration file. If you have a cPanel-based server, you want to ensure that you have the following ports opened for inbound and outbound traffic:
# Allow incoming TCP ports
TCP_IN = "20,21,22,25,53,80,143,443,465,587,993,995,2078,2082,2083,2086,2087,2095,2096"
# Allow outgoing TCP ports
TCP_OUT = "20,21,22,25,37,43,53,80,110,113,443,465,587,873,995,1167,2086,2087,2089"
Those ports cover most of the ports you will need for your cPanel or non-cPanel server to function. You can read more about ports and their functions here: http://en.wikipedia.org/wiki/List_of_TCP_and_UDP_port_numbers
Once you do that, you may want to limit the number of connections each user can make to your server. This is set by changing CT_LIMIT in your csf.conf to the number of connections you want to allow. For example, CT_LIMIT = "150" will only allow each IP address to make 150 connections to your server.
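As a toy demonstration of that edit, the sketch below works on a scratch copy so nothing on a live server is touched; on a real server the file is /etc/csf/csf.conf, and CSF must be reloaded with csf -r afterwards:

```shell
# Create a scratch copy standing in for /etc/csf/csf.conf
printf 'CT_LIMIT = "0"\n' > /tmp/csf.conf.demo
# Rewrite the CT_LIMIT line to cap each IP at 150 connections
sed -i 's/^CT_LIMIT = .*/CT_LIMIT = "150"/' /tmp/csf.conf.demo
cat /tmp/csf.conf.demo   # prints: CT_LIMIT = "150"
```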
You may also want to move SSH to a non-standard port and remove port 22 from TCP_IN, along with setting your sshd_config file to allow only public-key authentication. Why would you do this? It locks down your server from the outside so that only people who have SSH keys installed on your server can gain access using SSH.
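A minimal sketch of the sshd_config changes involved (the path is the usual default; always verify from a second SSH session before logging out, or you can lock yourself out):

```shell
# In /etc/ssh/sshd_config, set:
#   PubkeyAuthentication yes
#   PasswordAuthentication no
#   ChallengeResponseAuthentication no
# Then restart the daemon: service sshd restart
# Keep your current session open and test a fresh key-based login first.
```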
CSF can be configured in a multitude of ways to add another layer of security to your server. I highly recommend going to http://configserver.com/cp/csf.html and using the forums to learn more about the many features of CSF and how tweaking the settings can help ensure you're providing a stable, safe, and secure server environment.
If you've been a follower of the blog, you know that I've written a post on the importance of having backups of your data. I compared having a backup solution to having insurance on your automobile. That post was a generalized approach to backup solutions. For this week's post, I will delve deeper into the realm of backups. More specifically, we will discuss the different types of server backup options that currently exist. This post will be of a more technical nature than my previous posts, but I assure you, if you stay for the entire post, you will have a better idea of server backups and the myriad of options available to you.
For more info and to setup cloud backups for your server, visit http://www.turnkeyvault.com
Shall we begin? There are a few different methods that exist for creating server backups:
- Bare metal backup/restore
- Cloud backups
- Virtual server backups
I will go through each of these methods to give you an inside look into each option. Let's dive right in with bare metal backups and restores.
Bare Metal backup/restore
In disaster recovery, a bare metal restore is the process of rebuilding a computer from scratch after a catastrophic failure. This process entails reinstalling the operating system and applications and, where possible, restoring data and all settings. A bare metal restore allows you to restore to an unconfigured server, as the backup includes all the information needed to set up the machine and move the data over. The result is a ready-to-go backup server.
At a deeper level, bare metal backup/restores work by taking a "snapshot" of the server. This snapshot includes every file and folder that exists on the server, including all hidden files and directories. The snapshot is then pushed to an offsite location where the entire image can be deployed at a moment's notice. Whether you have a Windows server or a Linux server, bare metal restores copy the entire operating system structure. Usually, these backup images are rather large, as they are an exact replica of your running server.
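The snapshot idea can be illustrated on a tiny scale with nothing more than tar: archive a directory tree, hidden files included, and unpack it somewhere else intact. Real bare metal products do this for the entire disk, boot records and all; this is just a sketch of the concept with made-up file names:

```shell
# Build a small directory tree, including a hidden file
mkdir -p demo/site
echo "customer data" > demo/site/.hidden
# Take the "snapshot": one archive containing everything, permissions preserved
tar -czpf demo-snapshot.tar.gz demo
# "Restore" it to a fresh location
mkdir -p restore
tar -xzpf demo-snapshot.tar.gz -C restore
cat restore/demo/site/.hidden   # prints "customer data"
```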
For example, let's say that you have a full power outage at your company. When the power returns, you realize that your main hosting server has lost all data: it can't find the boot record to load the operating system, and all files have been removed. Since you've purchased a bare metal backup solution (you can view our current offers here: http://turnkeyvault.com/server_backups.php), you simply log in to your bare metal software, select the server you want to restore, and voilà: the operating system is re-installed with all applications. It's as if you never had the major system failure.
When I say cloud backup, what immediately comes to mind? I personally imagine a white, puffy cloud in the sky that resembles a vault. Was that what came to mind for you? If not, that's quite all right. A cloud backup is a piece of software that takes a snapshot of your server and then stores the backup in the cloud. What exactly do I mean by the cloud? Here, the cloud simply means storage that lives off-site and can be accessed from any location. Cloud backups allow for greater flexibility than a local disk or tape backup, which has the limitation of only being accessible locally. That could mean data stored on a different server sitting in your local office; in order to access the backup, you would have to drive into the office, connect the two servers, and then migrate the data over.
Do you already see the disadvantage of this type of local system? What if you're traveling, have a disaster, and need to restore your data? How will you do it if your business only keeps local backups? This is where a cloud backup comes into play. Since the backup is stored offsite and can be accessed via an internet connection, you can restore your data from anywhere in the world. This allows for greater flexibility in your backup solution. Another disadvantage of local backups is the space required for the backups. Say you have 1TB of data you need backed up, but you only have 500GB worth of space. What will you do? More than likely, you would just add a new device to your backup software: an additional hard drive, a USB drive, or perhaps network-attached storage.
With a cloud backup, depending on your vendor, you can usually just increase the resources of the cloud storage to accommodate your growing space needs. This allows you to rapidly add more space to your backup server as your data requirements increase. Now, in no way am I advocating that you remove your local backup options; instead, add another layer of backups, such as a cloud backup, to your current system. Having both local backups and cloud backups is a GREAT way to maintain business continuity.
Virtual server backups
Virtualization is one of the best things ever done for servers, as it allows one physical server to act as several servers. This dramatically reduces computing costs and boosts efficiency. One of the main challenges with backing up virtualized servers is backing up both the virtual servers' data and the main host node's data. When I say host node, I'm referring to the physical server that contains all of the virtualized servers. The reason you need to keep backups of both the host and the virtual servers can best be summed up with an example.
Your business has decided to virtualize all of the servers in your office. Your IT department recommends going with VMware. Fast forward a few months, and you have a major system failure within the host node: your main hard drive dies and you lose all the virtual servers that were stored on it. Luckily, you have a backup of the host node and simply restore it. However, upon checking the server, you notice an error. Sure, your main host node system files were restored, but all of your virtual servers' data is missing.
This example illustrates the need to have backups of both the physical host node and the virtual servers. The physical host node contains the system files that VMware needs to run, while the virtual servers need their own backups to restore the user data created within them. Usually the virtualized servers run a different operating system than the host node itself, so you need virtualized server backup software that can handle creating backups of the virtualized servers as well as the main host node.
You could have local backups of both the hostnode and the virtualized server that you can restore. You could go the bare metal route for the host node as well as virtualized servers or even the cloud backup method. It’s just important that you have backups of both the node and the virtual servers.
Hopefully after reading this post, you feel a bit better about the different backup options that exist and can come up with a backup solution that fits your company needs.
Griffin is from Williamsport, Pennsylvania, where he joined the United States Army as a Signal Support Specialist. He learned many aspects of the technology field during his six years of service. With two tours overseas, Griffin was able to perfect his craft in installations and in radio communications. Upon returning home, Griffin went to college to learn more about technology. He recently graduated with his first degree in Mobile Applications Development, and he did not stop there: he is currently finishing his second degree in Networking.
I wanted to learn as much as I could possibly learn, but the only way to gain a craft such as this is hands on experience. I was given a flyer for a job at Turnkey Internet. I was very excited to get an opportunity to work for the company. There is so much knowledge within Turnkey Internet. I love the people that I work with. It is a very personal work experience, and I have been able to learn so much from the employees. I did not go out and find myself a job, but instead I found myself a home with Turnkey Internet.
Randi comes to us from the snow making capital of the world, Hunter Mountain, NY. Randi has always had a strong fascination with electronics and how everything works. What makes it go from point A to point B. What happens if we modify the algorithm?
Unlike most children, Randi's least favorite holiday growing up was Christmas. Randi's brother always received electronic toys: powered cars and trucks, flying helicopters, a hoverboard. Randi was left with Barbies and playsets.
Randi’s brother would get bored with the electronic toy and go play with another. At this point, Randi would take the toy into her room to try and figure out how it worked. The toys always ended up in a hundred pieces on her floor. She had to figure out how the toy worked. Never being able to put it back together, she was grounded by dinner each year. Randi didn’t want to be a doctor, no need to take Barbie apart.
Knowing Randi's strong desire to understand how things work and her love for electronics, it was only natural her parents pushed her to attend a technical school. Randi studied at the Withlacoochee Technical Institute in Florida, then continued on to attend Mildred Elley and the Sage Colleges of Albany here in NY. Randi graduated at the top of her class with a degree in information technology.
Did you know that in August 2014, Google announced that HTTPS would become a ranking signal? This is news worthy of mention, as Google rarely reveals ranking criteria. It also confirms that the SEO-optimized hosting at TurnKey does in fact raise your rankings (at least with Google, now confirmed).
You can read more about Google's comments in their post, HTTPS as a ranking signal. [PDF]
HTTPS adds a layer of encryption, Secure Sockets Layer (SSL), on top of standard HTTP web traffic, providing additional security for web communications. SSL certificates are required for e-commerce sites, especially if you want your site to be PCI compliant. You can see my post on PCI compliance here: https://blog.turnkeyinternet.net/web_hosting/pci-dss-compliance-in-the-cloud-for-web-sites-servers-and-colocation/ . Having an SSL certificate is essential on an ecommerce site because of the secure transmission of sensitive information like credit card numbers, personal information, and login accounts.
If your website or blog begins with https://, you have likely received an uptick in Google’s rankings. This is currently a lightweight signal meaning that it doesn’t affect your site rankings greatly, but experts believe it will become stronger in the near future.
To turbocharge your website's rankings, be sure to use SSL certificates with a dedicated IP address on your website. TurnKey offers an all-in-one 'turnkey' solution to help you increase your search engine ranking: our Turbo SEO cPanel Web Hosting bundles multiple dedicated class-C IPs and SSL certificates for one low cost in a simple-to-use interface.
Do You Need an SSL Certificate for Your Website?
REQUIRED: All websites should have some form of protection on them. This protection can come in many forms; however, if you're going to be taking any type of data from your customers, such as credit cards, phone numbers, emails, or any personal information, you need to ensure that the data is transferred securely. SSL remains one of the most robust ways to do this.
As an online merchant, it's your responsibility to make sure your customers' private information is secure. If you are storing credit card information in a database on your website so you can manually charge cards later, then you need an SSL certificate to secure the credit card data stored on your server. If you have any sort of log-in form where customers enter a username and password, then on top of sanitizing the input from the user, an SSL certificate is highly recommended.
NOT REQUIRED: An SSL certificate is optional if you don't gather personal information and instead forward your customers to a 3rd-party payment processor like PayPal. This can be done as simply as embedding a PayPal button on your website; PayPal uses its own certificate to encrypt customers' transactions. HOWEVER, you can still benefit from SSL for search engine rankings, so it's worth the investment, just not REQUIRED for this category.
What Should Webmasters Do Now?
Decide the kind of certificate you need: single, multi-domain, or wildcard certificate. (More on this in a minute.)
- Use 2048-bit key certificates.
- Use relative URLs for resources that reside on the same secure domain.
- Use protocol-relative URLs for all other domains.
- Don't block your HTTPS site from crawling using robots.txt.
- Allow indexing of your pages by search engines where possible by avoiding the noindex robots meta tag.
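As a concrete example of the 2048-bit recommendation, this is roughly how you would generate a 2048-bit key and a certificate signing request with OpenSSL. The file names and domain here are placeholders, not anything your CA requires:

```shell
# Generate a 2048-bit RSA private key
openssl genrsa -out example.com.key 2048
# Create the certificate signing request (CSR) to send to your CA
openssl req -new -key example.com.key \
    -subj "/CN=example.com" -out example.com.csr
# Confirm the key size
openssl rsa -in example.com.key -noout -text | grep "Private-Key"
```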
Purchase an SSL Certificate from TurnkeySSL.com
Turnkey Internet is a trusted reseller of GlobalSign SSLs. GlobalSign SSL certificates include domain validation, quick issuance, and re-issues, among many other options, such as a fully trusted bar in the browser that lets visitors see your SSL is trusted across the web.
Turnkey Internet has multiple types of SSL certificates for secure communication with business, system, portals, mail and more.
Our TurnKeySSL Alpha certificate is ideal for small businesses, blogs, and personal websites, and costs $29 per year.
The TurnkeySSL Professional certificate is ideal if you wish to have multiple subdomains covered (example: corp.yourdomain.com and web.yourdomain.com). This Pro-level SSL certificate has full organization vetting, which provides higher levels of trust, and includes a malware site scan service. It is also preferred for service providers and SEO companies. It costs $150 per year.
Lastly, TurnKeySSL Extended Validation (EV) certificates are the most secure and offer visitors the green bar, enhancing trust and sales. They also include the malware site scan service. The green address bar that comes with a TurnkeySSL Extended certificate prominently displays your company name, providing immediate trust and improving customer conversions. This certificate is $899 per year.
Keeping your certificate up to date is recommended, as you never want your clients receiving SSL warnings when purchasing a product from you. Ideally, you would set the certificate to auto-renew annually. You can always check the expiration date by clicking the padlock symbol in your browser and then "View Certificate". Test your entire checkout process in Firefox, Google Chrome, and yes, even Internet Explorer.
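You can also check the expiration date from the command line. The sketch below creates a throwaway self-signed certificate to stand in for your real one; for a live site you would point openssl at your own domain instead:

```shell
# Create a stand-in self-signed certificate, valid for 365 days
openssl req -x509 -newkey rsa:2048 -keyout demo.key -out demo.pem \
    -days 365 -nodes -subj "/CN=example.com"
# Print its expiration date (a "notAfter=" line)
openssl x509 -in demo.pem -noout -enddate
# Live-site equivalent (requires network access):
#   echo | openssl s_client -connect yourdomain.com:443 2>/dev/null | openssl x509 -noout -enddate
```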
Once again, you can go directly to turnkeyssl.com to purchase any of the SSL types mentioned above.
Until next time…
If you haven’t read Part I you can do so at https://blog.turnkeyinternet.net/ask-the-expert/net-neutrality-if-it-isnt-broken-dont-fix-it/
Let's touch on traffic prioritization. What's one of the easiest things to delay in order to prioritize other traffic? Email. After all, email doesn't need to stream all at once like a song or a movie, so it seems like the easiest thing to delay while prioritizing other traffic. Now, suppose that email is queued up all day long until, say, 3 a.m., when traffic flow is at its lowest. At 3 a.m., these email servers begin sending all the emails they've queued up all day. No problem, right? Heck, it's only email. Hmm, suppose you emailed that signed contract for the proposal that had to be accepted by 5 p.m. the day before, but your email got queued up to make way for traffic that had been prioritized over its delivery. Are you beginning to see where this is going?
People have been using applications like Twitter to instantly reach multiple people at the same time. Suppose a town is under siege by a group that is trying to take over the town. Suppose those townsfolk are alerting other family members and friends about the invading troops' whereabouts so they can keep their loved ones out of harm's way. For the sake of argument, let's say that Twitter does not pay traffic carriers extra money to prioritize their traffic. What happens now? If your home is about to be invaded in five minutes and the message warning you of the impending invasion is not sent to you until 15 or 20 minutes later, it's too late. I know this is an extreme example, but prioritizing traffic could actually become a matter of life and death.
President Obama embraced, almost exactly, the comments I posted on the FCC website concerning this matter. In fact, what he is proposing is so close to my comments that I’m sure they must have been passed along to him. In short, my comments were to keep internet traffic neutral – nobody’s traffic gets prioritized. If you are a content provider and you want your traffic prioritized, then setup your own network and allow people to buy bandwidth directly from you and only your content is delivered over your private bandwidth. The issue is though, it’s not the content providers who are causing the ruckus, it is the traffic carriers who have brought this whole issue about.
Where President Obama strayed from my comments was in recommending that the internet become regulated as a utility, like the phone companies. Let me just say this: "Dear God, save us all." I used to own a VoIP company, and the myriad of taxes, fees, and surcharges on phone service staggers the mind. Do NOT let this happen to internet services. Roughly 31%-35% of your phone bill is comprised of taxes, fees, and surcharges. The FCC says that it wants to regulate internet companies under Title II so it can control them. I guess the FCC hasn't figured out that it already regulates the internet and VoIP; no Title II regulations are needed.
I say that the content providers should not be charged just because people want to download data from them. The charge for the bandwidth end users need should come from the traffic carrier. This would be a good thing, because once the price of a traffic carrier gets too high, someone will step in with a less costly way to provide bandwidth. It's called good old-fashioned competition. The traffic carriers want to make the content providers look like the bad guys. And, as if that isn't bad enough, the traffic carriers are already double-dipping on the profits. Anyone who is younger than 35 or perhaps 40 may not remember the days of completely free TV. TV was free because the networks made their money from the advertising space that they sold. Then along came cable companies that charge end users a monthly fee for TV shows AND collect all the revenues from the advertisers. Man, talk about a cash cow: they're getting paid on both sides, and now they want to charge content providers too? Are you serious? And remember, the content providers are already paying their hosting company for all the bandwidth they need. Why should they have to pay again? The traffic carriers have oversubscribed their networks; that is a problem solely created and owned by the traffic carriers, and the content providers should not be held responsible for it. Let me offer an analogy with three parties in the same roles:
- City water department – aka content provider (they provide water)
- Building contractor who builds houses on a huge tract of land she/he owns – aka traffic carrier
- End user – you and me who have bought homes from the building contractor
We purchased our homes, and the contractor guarantees us 5 gallons per minute of water flow. The contractor runs a six-inch main to serve the housing development and builds just enough homes so that the water flow to each house is 5 gallons per minute. The problem is, the building contractor wants to make more money, so they overbuild the development, and the water flow is now reduced to 2 gallons per minute simply because there are too many homes using water at any given time. So the building contractor turns around and decides to charge the city water department a fee to prioritize water flow to certain homes, which of course will reduce water flow even further to homes that do not receive prioritized water flow. The problem is not the city water department; it has more than enough capacity to serve all the homes in the development with 5 gallons per minute. The problem is that the building contractor only put in a six-inch main when what is needed is a twenty-four-inch main. This is not a problem created by the homeowners or the city water department; the problem is that the building contractor did not put in a large enough main pipe to feed all the customers.
And that is exactly what is happening with the flow of traffic on the internet. The traffic carriers (building contractor) do not have the capacity to give the end users (homeowners) all the bandwidth (water flow) that they guaranteed, and the traffic carriers are turning around and charging the content providers (city water departments) a fee to prioritize their content. The building contractors (traffic carriers) want to make the reduced water flow (bandwidth) appear to be a problem caused by the city water department (content provider). And that is just not the case.
A peripheral but related issue here is the maddening amount of video and audio content that is displayed on websites. How many times have you gone to a website and an advertisement begins to automatically play? Also, how many times do you visit a site and want to read the story but there is no text, just a video? That drives me nuts. All I want to do is read the story, I do not want to watch a video. Sorry, I’m going off on a tangent.
- Leave the internet as it is
- Do not regulate it like telephone companies are regulated unless you want to see prices increase due to fees, taxes and surcharges
- Leave the FCC in charge of the internet and VoIP
End of story, it truly IS that simple.
Earlier on the blog, I wrote to you about having backup software. I compared having backup software to having car insurance. You never know you need it until you actually need it. Does that make sense? I hope that last line wasn’t too confusing. Well, I have another question for you to start this article.
Do you currently have any monitoring software for your server?
Now, depending on where you host your website or rent your server from, the host may provide a basic type of monitoring software. For example, if you purchased a dedicated server, Virtual Private Server (VPS), or Cloud Server from us, Turnkey Internet, your server will automatically be set up with basic ping monitoring. This works using ICMP, a basic protocol used across the industry to monitor servers. I won't get too off-base with this post, so you can read more about ICMP at the link below:
Now, you may be asking, "What if I bought a reseller or basic hosting account? Is that only ping monitored?" In our system, all of our reseller and hosting servers have another level of monitoring attached to them. This includes ping monitoring, memory monitoring, drive space monitoring, SNMP monitoring, and bandwidth monitoring, to name a few. We can also set up content checks. That means we can set up a monitor that will check whether a site contains a particular word or piece of text; if it doesn't find the word, the server will alarm for us.
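A content check of this kind is simple to sketch yourself. The function below flags a page that is missing an expected keyword; the URL, keyword, and sample text are all placeholders, and a real monitor would send an alert rather than just print:

```shell
# check_content PAGE_TEXT KEYWORD -> prints OK or ALARM
check_content() {
    if echo "$1" | grep -q "$2"; then
        echo "OK: keyword found"
    else
        echo "ALARM: keyword missing"
    fi
}

# In a real monitor you would feed it the live page, e.g.:
#   check_content "$(curl -s --max-time 10 http://example.com/)" "Welcome"
check_content "Welcome to our store" "Welcome"    # prints "OK: keyword found"
check_content "503 Service Unavailable" "Welcome" # prints "ALARM: keyword missing"
```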
You may be asking why? Why write an article on monitoring software? Well my friends, in slaying tickets each week, I come across many different issues across different clients. Some of these issues could have been prevented and others would have had a smaller impact if preventative measures were taken. Let me give you an example to really drive this one home.
Let's say you have a website named jeremysdomain.com and purchased it directly from Turnkey Internet with a dedicated server. Your site is the lifeline of your business: you take orders online, you run promotions online, among other things. Next thing you know, you go to your site and it doesn't load. In fact, it just times out completely.
You can still ping your server, but your site is fully offline. You open a ticket with the helpdesk, and they inform you that your server is overloaded due to a large spike in bandwidth. This resulted in your server running low on memory and crashing. The engineers fix the issue and inform you that you may want to consider monitoring software that will constantly check for issues beyond a failed ping. The entire process takes about an hour to get the server back online.
Now let's look at the same situation with monitoring software in place. You start a promotion on your website. As the promotion gets into full swing, you receive an email notification stating that your server is alarming on multiple items. The engineers inform you that your bandwidth is beginning to max out the server, which results in your server running low on memory. The engineers schedule a time with you to take the server offline and increase its memory. Your site is down only 15 minutes for the upgrade. Your promotion never skips a beat, and your customers never even notice the issue.
If having backup software for your server is like having insurance on your car, then monitoring software is like having a super, upgraded alarm system in your car that checks your oil level, your tire pressure, and your electrical components, among many other items.
Do you have monitoring software? If not, go to http://turnkeymonitoring.com/ to see some of the options available to you.
Until next time Turnkey Lovers…
TurnKey Internet, Inc Announces Customer Loyalty Bonus this Black Friday – up to 80% Off for Life
LATHAM, NEW YORK (November 24, 2014) – Sustainable IT solutions provider TurnKey Internet, Inc. announced today the launch of their 2014 Black Friday Deals offering some of the best cloud services, datacenter facility services, and web hosting offers for 2014.
TurnKey Internet is known for running its eagerly-awaited, industry-leading Black Friday specials, and this year is no different. TurnKey Internet is offering 80% off for the life of nearly every product they offer – cloud servers, dedicated servers, virtual private servers, cPanel web hosting, Microsoft Windows web hosting, enterprise colocation services, SEO optimized web hosting, and much more. More information can be found at https://www.turnkeyinternet.net/bf.
In a bold move, and in contrast to other companies that exclude existing clients from the best seasonal deals, TurnKey is also rewarding its existing clients with a loyalty bonus: any client taking advantage of a new cloud service gets the 80% off for life deal on a new purchase, plus a bonus free month of service for every previous year of loyalty.
“We love our loyal clients and I’m truly excited this year to announce our loyalty bonus during this promotion,” remarked Adam Wills, CEO of TurnKey Internet. “While cable companies offer introductory 12-month rates to new clients only, or the phone companies exclude existing clients from the best deals, TurnKey is focused on giving back to our loyal clients with added bonuses on top of access to the very best cloud hosted services. Last year exceeded all expectations, and I am happy we are offering deals like these to our valued clients and potential new clients across the globe.”
About TurnKey Internet
Founded in 1999, TurnKey Internet, Inc. is a full-service green data center and leading provider of sustainable web hosting and IT solutions. From its SSAE 16 Type 2 and ENERGY STAR® certified facility in Latham, NY—New York’s Tech Valley Region—TurnKey offers web hosting, communication services, web-based IT systems, software as a service (SaaS), enterprise colocation services, and computing as a service to clients in more than 150 countries. For more information, please call (518) 618-0999 or visit www.turnkeyinternet.net/media.
Net neutrality, as defined by the Federal Communications Commission, is: “The ‘Open Internet’ is the Internet as we know it. It’s open because it uses free, publicly available standards that anyone can access and build to, and it treats all traffic that flows across the network in roughly the same way. The principle of the Open Internet is sometimes referred to as ‘net neutrality.’ Under this principle, consumers can make their own choices about what applications and services to use and are free to decide what lawful content they want to access, create, or share with others. This openness promotes competition and enables investment and innovation.”
There are basically only three parties in the whole internet/net neutrality equation:
- Retail Traffic carriers: such as cable companies, phone companies, satellite companies, etc.
- Content providers: such as Netflix, YouTube, Hulu, Spotify, etc.
- End users: you and me
The whole net neutrality issue has come about because the traffic carriers want to charge content providers a fee for “fast lane” service to deliver their content to the end users.
So what’s wrong with this picture? Here’s what’s wrong with it… You and I pay a monthly fee to a traffic carrier for X amount of bandwidth. You and I want to watch a movie or listen to a song from a content provider. Hey, we have paid a fee for our bandwidth, so we rightfully expect to be able to utilize that bandwidth whenever we want to. The problem is, the traffic carriers have oversubscribed their networks. Let’s say a traffic carrier serves a neighborhood of 1000 homes, gives each home 5Mbps of bandwidth, and the total bandwidth that the traffic carrier can carry on its main circuit to that neighborhood is 500Mbps. This means that if 100 homes are utilizing the full 5Mbps of bandwidth they have paid for, the other 900 homes will have no bandwidth and won’t be able to use the internet. Now, bear in mind that this is the theoretical limit. Since data is moved in bits and pieces, all 1000 homes in the neighborhood would have at least some access to the net, but if everyone maxed their connection at the same time, everyone’s connection would slow to a crawl, since the maximum available bandwidth in this example is only 500Mbps. The traffic carriers are gambling that only a certain percentage of the end users will be online at any given point in time, and that only a certain percentage of end users will be using the maximum bandwidth they have paid for at any given point in time. Quite frankly, that’s a workable model and one that has prevailed over time. The problem is: what if all the end users want to use their full allotted bandwidth all at the same time? If that happens, the traffic carriers cannot provide what the end users have paid for, because they don’t have that much bandwidth. This is known as oversubscribing your network.
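The arithmetic behind that neighborhood example is worth spelling out. With 1000 homes sold 5Mbps each against a 500Mbps uplink, the carrier has sold ten times the capacity it actually has:

```python
# Worked version of the neighborhood example from the post:
# 1000 homes sold 5 Mbps each, but only 500 Mbps of uplink capacity.

homes = 1000
per_home_mbps = 5
uplink_mbps = 500

sold_mbps = homes * per_home_mbps        # total bandwidth sold
oversub_ratio = sold_mbps / uplink_mbps  # oversubscription ratio

# If every home tried to use its full 5 Mbps at once, an equal split
# of the uplink gives each home only a fraction of what it paid for.
fair_share_mbps = uplink_mbps / homes

print(f"Bandwidth sold: {sold_mbps} Mbps")             # 5000 Mbps
print(f"Oversubscription ratio: {oversub_ratio:.0f}:1")  # 10:1
print(f"Share if all homes max out: {fair_share_mbps} Mbps")  # 0.5 Mbps
```

In other words, under total congestion each home would get a tenth of its advertised speed — which is exactly why carriers bet that everyone never maxes out at once.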
So, what the traffic carriers want to do is to charge the content providers in order to give the content providers’ data, priority over other types of traffic. This is completely wrong because the content providers host their servers at a data center (or multiple data centers) and they are already paying the data centers for all the bandwidth they need. It is the traffic carriers who have oversubscribed their networks and yet it’s the traffic carriers who want to charge the content providers.
The content providers are not the problem. The content providers have paid their hosting company (or companies) for ample bandwidth to move their data to the end users. It is the traffic carriers, who have oversubscribed their networks, that are causing the issue — and yet it is the traffic carriers who want to charge the content providers to prioritize their traffic. Which raises another question: if the content providers’ traffic is prioritized over other traffic, then what does that do to your VoIP phone service, or your email, or the content you’re trying to get from a company that doesn’t pay to have its traffic prioritized? What needs to happen here is that there should be no charge to the content provider. The traffic carriers need to increase their overall capacity so that the end users can download whatever they want, whenever they want it, at the maximum speed they have paid the traffic carrier for.
Think about this… if your next-door neighbor downloads content from a content provider who has paid the traffic carrier a fee to prioritize its traffic, and you are downloading something from a content provider who has *not* paid that additional fee, you could see your download slow to a crawl while the traffic carrier prioritizes the paying content provider’s traffic over the nonpaying one’s. Under this scenario, you, and the content provider you are using, are both being penalized so the traffic carrier can prioritize the data of the paying content provider.
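The starvation effect in that scenario follows directly from strict-priority scheduling: on a congested link, paying flows are served first and everyone else gets the leftovers. Here is a toy sketch of that allocation — the link size, flow names, and demands are all made up for illustration, and real carrier traffic management is far more complex, but the principle is the same:

```python
# Toy model of strict-priority bandwidth allocation on a congested link.
# All numbers and names below are illustrative assumptions.

def allocate(link_mbps, flows):
    """Serve paying (priority) flows first, then everyone else.

    flows is a list of (name, demand_mbps, paying) tuples.
    Returns a dict of name -> bandwidth actually granted.
    """
    granted = {}
    remaining = link_mbps
    # Sort so paying flows come first (not True sorts before not False).
    for name, demand, paying in sorted(flows, key=lambda f: not f[2]):
        got = min(demand, remaining)
        granted[name] = got
        remaining -= got
    return granted

flows = [
    ("paying_provider", 80, True),    # paid the carrier for priority
    ("startup_provider", 40, False),  # did not pay
]
print(allocate(100, flows))
```

On a 100Mbps link, the paying provider gets its full 80Mbps while the startup, which asked for 40Mbps, is squeezed into the remaining 20Mbps — your neighbor streams smoothly while your download crawls.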
If traffic carriers are allowed to charge content providers to prioritize their traffic, that may become an insurmountable barrier to countless new businesses that could potentially exist. Think of it this way… suppose content providers have to pay traffic carriers to carry their traffic — what happens when someone comes up with a new idea, like YouTube? Can you imagine a couple of kids in a garage who come up with a great idea, but next to no one can download the content because the kids are running a startup and don’t have the kind of cash needed to pay traffic carriers to prioritize their content? Think about how that would stifle competition and keep the latest and greatest ideas from getting out to the public.
This will be a multiple part series since there is so much ground to cover on this topic. Stay tuned for more.