
Drupal Web Site Security Alert: Forged Password Reset URLs

Mar 24, 2015 @ 8:57am cloud security,Web hosting

You may recall being urged by Drupal to update your software late in 2014 due to SQL injection attacks on compromised Drupal 7 sites. Drupal has now released versions 6.35 and 7.35 to address a few newly discovered vulnerabilities within their software.

In an advisory, Drupal's security team stated that one of the vulnerabilities being addressed allows password reset URLs to be forged. This lets malicious users gain access to an account without knowing its password.

In Drupal 7, this vulnerability is limited to sites where accounts have been imported or edited in ways that result in the password hash in the database being identical for multiple user accounts.

In Drupal 6, this vulnerability can be exploited on sites where administrators have created multiple user accounts with the same password, or where accounts have been imported or edited in ways that result in the password hash in the database being empty for at least one user account. Drupal 6 sites with an empty password hash, or a password hash derived from an easily guessed string, are extremely prone to this vulnerability.
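If you want to check whether your own site fits these conditions, the users table will show duplicate or empty password hashes directly. Here is a minimal sketch using the mysql command-line client; the database name and credentials are placeholders, so adjust them for your own installation:

# Password hashes shared by more than one account
mysql -u dbuser -p drupal -e "SELECT pass, COUNT(*) AS accounts FROM users GROUP BY pass HAVING COUNT(*) > 1;"

# Accounts with an empty password hash (uid 0 is the built-in anonymous row)
mysql -u dbuser -p drupal -e "SELECT uid, name FROM users WHERE pass = '' AND uid > 0;"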

The second vulnerability Drupal's team has patched is the ability for malicious users to craft a URL that sends visitors to a third-party website.

Drupal modules use a destination query parameter to redirect users to a new page after completing an action. Malicious users can abuse this destination parameter to construct a URL that fools users by redirecting them to a third-party website. Several URL-related API functions in Drupal 6 and 7 can also be tricked into passing through external URLs when that was not the intention, leading to open redirect vulnerabilities.

This vulnerability has been downplayed, as most uses of the destination parameter are not vulnerable to the attack. However, all confirmation forms built using Drupal 7's form API are vulnerable, and Drupal has stated that some Drupal 6 confirmation forms are vulnerable as well.

Drupal versions affected:

Drupal core 6.x versions prior to 6.35

Drupal core 7.x versions prior to 7.35

How do you rectify these vulnerabilities? Update to the latest versions.

If you use Drupal 6.x, upgrade to Drupal core 6.35

If you use Drupal 7.x, upgrade to Drupal core 7.35
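If you manage your site from the command line and have Drush installed, the core update itself is only a few commands. This is a minimal sketch for a Drupal 7 site; back up your files and database first:

drush vset maintenance_mode 1   # take the site offline while updating
drush up drupal                 # download and apply the Drupal core update
drush vset maintenance_mode 0   # bring the site back online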

If you use TurnKey Internet's web hosting with Drupal, you can simply log in to your cPanel control panel, click on the Softaculous icon, and update your Drupal version from there, or update from the admin control panel of the installed copy on your web site. If you have any questions, contact our customer service team, or keep posted on our help desk at http://helpdesk.turnkeyinternet.net/

Written by admin on March 24th, 2015


Advanced Cloud Backup for Servers, PCs, and your Office to achieve full Business Continuity

Mar 1, 2015 @ 11:45am backup

In the past, I've spoken to you about purchasing backup software. I've compared having backup software for your server to having car insurance. I've also spoken with you about the different types of backup technologies that exist, and went into detail about a few of them. These were the following:

 

 

  1. Bare Metal backup/restores
  2. Cloud Backups
  3. Virtual Server backups

For more information, check it out at TurnkeyVault.com

You can read about these technologies on our blog. For this article, I will be talking about a new type of backup software: a brand new product that I'm very excited to announce, one that combines many of the different backup technologies into a new, hybrid backup system. It is a system that allows you to ensure business continuity for your customers. For those of you who do not know what business continuity is, I shall explain. Business continuity means that your business goes on even if your systems fail. Restores are fast, and even if a system is down, continuity means you still have access to key pieces of business data. Maintaining business continuity should be one of your main focuses as a business owner.

 

Our new backup product helps to ensure business continuity by combining the best of the backup technologies presently on the market. To do this, our software had to address the following needs:

 

  1. Ability to create server backups and workstation backups
  2. Ability to create backups locally as well as to the cloud
  3. Ability to only backup files that have changed in a system


To show you what the new software can do, I will go through each ability listed above and show how these three key areas combine to provide unparalleled continuity for your business. Shall we begin?

 

Ability to create server backups and workstation backups

 

Server backups are the main component of any data protection installation. Servers are where all of the data resides. This includes current, recent, and in many cases older data. The server is also where operating systems, applications, configurations, and system states reside. Protecting these assets is the main job of any business owner or solution provider. Now, while most business owners ensure that server backups exist, many overlook their individual workstations. This includes the workstations in the field, home offices, and satellite facilities. The data on these PCs and workstations may contain important projects, critical documents, and irreplaceable creative works. Sometimes even workstations in the home office or headquarters are overlooked, despite being tied directly into the corporate network.

 

Ensuring that you have backups for both the server and the workstations is one step toward ensuring business continuity. Our new backup product allows for backups of both the servers and the workstations. The backups can be stored locally on an in-house backup server or backed up directly to the cloud, otherwise known as a disk-to-disk-to-cloud backup solution. This ensures that even if your local backup system fails, you will have another set of backup copies stored in the cloud that you can easily deploy. This leads us directly into the next feature that is required for business continuity.

 

Ability to create backups locally as well as to the cloud

 

Until rather recently, the main option for backups was to do it all locally, or on-premises. The backups were usually stored on a disk or even an additional tape drive. Larger businesses may have had another tier that sent backups off site for archiving. Research conducted by the technology research firm Gartner Inc. shows that backups in an average data center only worked about 85% of the time. Remote offices were even worse, at 75% of the time. Making matters worse, you do not know you have a bad backup until you attempt to restore it. With the introduction of the cloud, the game has changed. You can now back up quickly and securely to a hybrid cloud backup. What do I mean by hybrid cloud backup?

 

The hybrid cloud backup, or disk-to-disk-to-cloud, allows you to maintain an initial disk backup, which is still done in house, but adds an additional tier that stores the backup in the cloud. This tends to be the best of both worlds, as the cloud tier is scalable, easy to manage, and helps guarantee data restores properly. Restoring from the cloud is perfect for remote offices that aren't near the local disk backup, while the main benefit of restoring from local disk or tape is the speed of the restore. Our new backup software addresses the speed issue further by only restoring the changed blocks on a system, which increases the speed of the restoration dramatically. How does only backing up the data that has changed on a system speed up the restore process?
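Before we get there, here is a minimal sketch of the disk-to-disk-to-cloud idea using plain rsync. The paths and the cloud hostname are hypothetical, and our product automates and manages all of this for you:

# Tier 1: copy the live data to the in-house backup disk
rsync -a /var/www/ /backups/www/

# Tier 2: push the local backup copy offsite to cloud storage over SSH
rsync -a /backups/www/ backup@vault.example.com:/cloud/www/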

 

Ability to only backup files that have changed in a system

 

What determines the speed of a backup? One factor is the speed of the connection between the device or devices being backed up and the backup device itself. Another is the speed of the I/O (input/output), determined by the quickness of the disk. However, the biggest factor of them all is the amount of data being backed up. Now, when you first create a backup, there is little you can do to change the size of the data volume except compression. The initial backup copies the entire data set; there isn't any way around this. Once the full backup is in place, maintaining the backup is done by sending over only the changes to files.

 

Our new software does things a bit differently. Instead of backing up the changes to a file, the software only sends over changes to the blocks. Blocks are much smaller than a typical file. To put this in perspective, say you have a Word document that is 300 KB in size. You edit the file and change one word. A typical backup system will see that the file has changed and resend the entire file to the backup software, even though the edit barely changed the file's size. What our new software does is look at the changed block representing that one word and send only that over to the backup. This may not sound like a lot, but say you have thousands of files and have to resend every file, every time it changes; the size of the data adds up very quickly. Sending over updates to only the changed blocks of data is called deep deduplication. Deep deduplication allows for great savings in disk, or if you're backing up to the cloud, service costs. It also means your network isn't bogged down by having to transport massive data sets. These smaller data volumes also increase the longevity of your local backup system, as you're taking maximum advantage of your space.
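To make the block idea concrete, here is a crude sketch of block-level change detection using standard Linux tools. This is purely illustrative, with a hypothetical file path; it is not how our software's deduplication engine is actually built:

# Cut the file into 4 KB blocks and fingerprint each one
mkdir -p /tmp/blocks
split -b 4096 -d /data/report.doc /tmp/blocks/chunk.
touch /tmp/hashes.old
md5sum /tmp/blocks/chunk.* > /tmp/hashes.new

# Only the blocks listed here changed since the last run,
# so only these would need to be re-sent to the backup
diff /tmp/hashes.old /tmp/hashes.new | grep '^>'
mv /tmp/hashes.new /tmp/hashes.old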

 

 

To summarize, our new backup software addresses the main issues when discussing business continuity. Having business continuity is a competitive advantage that every serious business owner must consider. Having a disaster recovery solution in place that you can deploy in a matter of minutes will go a long way toward ensuring your business runs at optimum efficiency. We're calling our new backup software (Insert name of backup software). Head over to (Insert URL to backup software) and purchase some business continuity insurance for your business.

 

For more information, check it out at TurnkeyVault.com

The SSL POODLE that Bites – SSL 3.0 Issues for web sites

Feb 22, 2015 @ 11:20am internet security,Web hosting

When I say POODLE, what do you think of? Is it a fluffy dog? In most cases I would be referring to the fluffy dog, but for this article we will be focusing on a security vulnerability. I'm not sure if you're aware, but if you're currently using SSL version 3.0, you will need to perform some updates to the SSL daemon on your server. SSL stands for Secure Sockets Layer. An SSL certificate is something every ecommerce site should have: it allows you to securely process payments through your website. In fact, if you're taking orders from your clients, you should be using SSL. SSL adds another layer of security and trust for your clients. If you've not read my post on PCI compliance and you're running an ecommerce site, you should read it here: (Insert link to PCI compliance post)

 

As with any piece of software on the internet, SSL has different versions. SSL version 3.0 is nearly 18 years old; it is no longer secure, yet it remains in widespread use across the internet. Nearly all browsers support SSL version 3, and in order to work around bugs in HTTPS servers, browsers will retry failed connections with older protocol versions, including SSL 3.0. This retrying of failed connections is what allows the POODLE exploit to be initiated, and it can lead to a leak of your customers' data when they are processing orders. You can read more about the specifics of the attack here:

 

http://googleonlinesecurity.blogspot.com/2014/10/this-poodle-bites-exploiting-ssl-30.html

 

Browsers and websites should turn off SSLv3 in order to avoid compromising users' private data. The most straightforward method is to disable SSL 3.0 entirely, which you can see how to do at the links below; however, this can cause a myriad of compatibility issues. Therefore, the recommended option is to enable TLS_FALLBACK_SCSV. The links below will show you how to properly secure your server's SSL daemon. These options resolve the issue of retrying failed SSL connections, and also prevent knowledgeable attackers from downgrading connections from TLS 1.2 to 1.1 or 1.0.
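If you manage a plain Apache server rather than a control panel, disabling SSLv3 is a one-line change to the SSL configuration. Here is a sketch for Apache with mod_ssl; the file location varies by distribution, and the domain is a placeholder:

# In your Apache SSL configuration (for example /etc/httpd/conf.d/ssl.conf)
SSLProtocol All -SSLv2 -SSLv3

# Restart Apache, then verify that SSLv3 connections are now refused
openssl s_client -connect yourdomain.com:443 -ssl3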

 

 

For WHM/cPanel servers –  https://documentation.cpanel.net/display/CKB/How+to+Adjust+Cipher+Protocols

 

For DirectAdmin servers – http://forum.directadmin.com/showthread.php?t=50105

 

For Plesk servers – http://kb.sp.parallels.com/en/123160

 

Written by Jeremy on February 22nd, 2015


How to Setup a Firewall on your Cloud Server – CSF / CPanel, and more!

Feb 21, 2015 @ 12:02pm cloud security

I have a question for you: does your server have a firewall running? For those who do not know what a firewall is, let's go to our good friend Wikipedia:

‘In computing, a firewall is a network security system that controls the incoming and outgoing network traffic based on an applied rule set. A firewall establishes a barrier between a trusted, secure internal network and another network (e.g., the Internet) that is assumed not to be secure and trusted.’

As avid readers of the blog know, I like to ground these ideas with everyday analogies. You can think of a firewall like the door to your home. When the door is open, people can walk directly into your house. Should you want to keep people out, you close and lock the door. This is the way a firewall works on a server: you place the firewall onto your server to keep intruders on the internet from accessing your data.

Firewalls can be either hardware or software based. With a hardware-based firewall, the firewall is connected to your switch, and traffic is filtered based on a rule set you determine. You would typically use a hardware-based firewall with a dedicated server. A software-based firewall is installed within your server. It still blocks traffic based on rule sets you create; it just does it from within the server rather than out in front like a hardware-based firewall.

For the rest of this article, I will provide the steps to install CSF, which is short for ConfigServer Security & Firewall. This firewall is supported across many different operating systems: Red Hat Enterprise Linux, CentOS, CloudLinux, Fedora, Virtuozzo, and VMware, to name a few. You can read more about the supported systems here: http://configserver.com/cp/csf.html

This firewall can be installed with the following steps on your Linux-based server:

mkdir /usr/local/src <-- Creates the directory to install CSF

cd /usr/local/src <-- Changes your location on the server to the newly created directory

wget http://www.configserver.com/free/csf.tgz <-- Downloads the CSF software to your server

tar xfz csf.tgz <-- Extracts the software

cd csf <-- Changes your location on the server to the CSF directory

./install.sh <-- Installs the CSF firewall

CSF, when installed and configured properly, places a preset list of rules onto your server. These rules can be configured directly within csf.conf, the CSF configuration file. If you have a cPanel-based server, you want to ensure that you have the following ports opened for inbound and outbound traffic:

# Allow incoming TCP ports
TCP_IN = "20,21,22,25,53,80,143,443,465,587,993,995,2078,2082,2083,2086,2087,2095,2096"

# Allow outgoing TCP ports
TCP_OUT = "20,21,22,25,37,43,53,80,110,113,443,465,587,873,995,1167,2086,2087,2089"

Those ports cover most of the ports you will need for your cPanel or non-cPanel server to function. You can read more about ports and their functions here: http://en.wikipedia.org/wiki/List_of_TCP_and_UDP_port_numbers

Once you do that, you may want to limit the number of connections each user can make to your server. This is set by changing CT_LIMIT in your csf.conf to the number of connections you want to allow. For example, CT_LIMIT = "150" will only allow each IP address to make 150 connections to your server.
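Beyond the configuration file, most day-to-day CSF management is done with a handful of command-line flags (see man csf for the full list; the IP addresses below are just examples):

csf -d 203.0.113.50   # deny (block) an IP address
csf -a 198.51.100.7   # allow (whitelist) an IP address
csf -r                # restart the firewall after editing csf.conf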

You may also want to remove port 22 from TCP_IN, along with setting your sshd_config file to allow only public key authentication. Why would you do this? It locks down your server from the outside and only allows people who have SSH keys installed on your server to gain access using SSH.
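Here is a sketch of the relevant lines in /etc/ssh/sshd_config. Test a key-based login in a second session before restarting SSH, or you risk locking yourself out:

# /etc/ssh/sshd_config -- allow keys, refuse passwords
PubkeyAuthentication yes
PasswordAuthentication no

# Reload the SSH daemon to apply the change
service sshd restart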

CSF can be configured in a multitude of ways to add another layer of security to your server. I highly recommend going to http://configserver.com/cp/csf.html and using the forums to learn more about the many features of CSF and how tweaking the settings can help ensure you're providing a stable, safe, and secure server environment.

Written by Jeremy on February 21st, 2015


How Important is your Data? Server Backups in the Cloud Explained

Feb 20, 2015 @ 11:40am backup

If you've been a follower of the blog, you know that I've written a post on the importance of having backups of your data. I compared having a backup solution to having insurance on your automobile. That post was a generalized approach to backup solutions. For this week's post, I will delve deeper into the realm of backups. More specifically, we will discuss the different types of server backup options that currently exist. This post will be of a more technical nature than my previous posts, but I assure you, if you stay for the entire post, you will have a better idea of server backups and the myriad of options that are available to you.

For more info, and to set up cloud backups for your server, visit http://www.turnkeyvault.com

 

Shall we begin? There are a few different methods that exist for creating server backups:

 

  1. Bare metal backup/restore
  2. Cloud backups
  3. Virtual server backups

 

I will go through each of these methods to give you an inside look at each option. Let's dive right in with bare metal backups and restores.

 

Bare Metal backup/restore

 

In disaster recovery, a bare metal restore is the process of rebuilding a computer from scratch after a catastrophic failure. This process entails reinstalling the operating system and applications and, if possible, restoring data and all settings. Bare metal restores allow you to restore to an unconfigured server, as the backup includes all the information needed to set up the machine and move the data over. This results in a ready-to-go backup server.

 

At a deeper level, bare metal backup/restores work by taking a "snapshot" of the server. This snapshot includes every file and folder that exists on the server, including all hidden files and directories. The snapshot is then pushed to an offsite location where the entire image can be deployed at a moment's notice. Whether you have a Windows server or a Linux server, a bare metal restore will copy the entire operating system structure. These backup images are usually rather large, as they are an exact replica of your running server.
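To make "an exact replica" concrete, the crudest form of a bare metal image can be taken by hand with dd. This is a sketch only, to be run from a rescue environment rather than the live system; real bare metal backup products manage this automatically and far more efficiently:

# Image the entire disk, boot record and all, to a compressed file on another drive
dd if=/dev/sda bs=4M | gzip > /mnt/backup/sda-image.gz

# Restoring writes the image straight back onto bare hardware
gunzip -c /mnt/backup/sda-image.gz | dd of=/dev/sda bs=4M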

 

For example, let's say that you have a full power outage at your company. When the power returns, you realize that your main hosting server has lost all data: it can't find the boot record to load the operating system, and all files have been removed. Since you've purchased a bare metal backup solution (you can view our current offers here: http://turnkeyvault.com/server_backups.php), you simply log in to your bare metal software, select the server you want to restore, and voilà. The operating system is reinstalled with all applications. It's as if you never had the major system failure.

 

Cloud backups

 

When I say cloud backup, what immediately comes to mind? I personally imagine a white, puffy cloud in the sky that resembles a vault. Was that what came to mind for you? If not, that's quite all right. A cloud backup is a piece of software that takes a snapshot of your server and then stores the backup in the cloud. What exactly do I mean by the cloud? The cloud, for our purposes, is storage hosted off-site that can be accessed from any location. Cloud backups allow for greater flexibility than a local disk or tape backup. A disk or tape backup has the limitation that the data can only be accessed locally. That could mean the data is stored on a different server sitting in your local office: in order to access the backup, you would have to drive into your office, connect the two servers, and then migrate the data over.

 

Do you already see the disadvantage of this type of local system? What if you're traveling, have a disaster, and need to restore your data? How will you do it if your business only keeps local backups? This is where a cloud backup comes into play. Since the backup is stored offsite and can be accessed via an internet connection, you can restore your data from anywhere in the world. This allows for greater flexibility in your backup solution. Another disadvantage of local backups is the size, or space requirements, of the backups. Say you have 1TB of data you need backed up, but you only have 500GB worth of space. What will you do? More than likely, you would just add a new device to your backup setup. This may be an additional hard drive, a USB drive, or maybe network attached storage.

 

Well, with a cloud backup, depending on your vendor, you can usually just increase the resources of the cloud storage to accommodate your growing space needs. This allows you to rapidly add more space to your backup server as your data space requirements increase. Now, in no way am I advocating that you should remove your local backup options; instead, add another layer of backups to your current system, such as a cloud backup. Having both local backups and cloud backups is a GREAT way to maintain business continuity.

 

 

Virtual server backups

 

Virtualization is one of the best things to ever happen to servers, as it allows one physical server to act as several servers. This dramatically reduces computing costs and boosts efficiency. One of the main challenges with backing up virtualized servers is backing up both the virtual servers' data and the main host node's data. When I say host node, I'm referring to the physical server that contains all of the virtualized servers. The reason you need to keep backups for both the host and the virtual servers can best be summed up with an example.

 

Your business has decided to virtualize all of the servers in your office. Your IT department recommends going with VMware. Fast forward a few months, and you have a major system failure within the host node. Your main hard drive dies, and you lose all virtual servers that were stored on the server. Luckily, you have a backup of the host node, so you just restore it. However, upon checking the server, you notice an error: sure, your main host node system files were restored, but all of your virtual servers' data is missing.

 

This example illustrates the need to have backups of both the physical host node and the virtual servers. The physical host node contains the system files that VMware needs to run. The virtual servers also need a backup to restore the user data that has been created within them. Usually the virtualized servers run a different type of operating system than a normal, non-virtualized server would contain. You would need virtualized server backup software that can handle creating backups of the virtualized servers as well as the main host node itself.

 

You could keep local backups of both the host node and the virtualized servers that you can restore. You could go the bare metal route for the host node as well as the virtualized servers, or even use the cloud backup method. It's just important that you have backups of both the node and the virtual servers.

 

For more info, and to set up cloud backups for your server, visit http://www.turnkeyvault.com

Hopefully after reading this post, you feel a bit better about the different backup options that exist and can come up with a backup solution that fits your company's needs.

Meet The TurnKey Team – Griffin

Feb 14, 2015 @ 11:36am Staff Interviews

Griffin is from Williamsport, Pennsylvania, where he joined the United States Army as a Signal Support Specialist. He learned many aspects of the technology field during his six years of service. With two tours overseas, Griffin was able to perfect his craft in installations and in radio communications. Upon returning home, Griffin went to college to learn more about technology. He recently graduated with his first degree in Mobile Applications Development, and did not stop there. He is currently finishing his second degree in Networking.

I wanted to learn as much as I could possibly learn, but the only way to gain a craft such as this is hands-on experience. I was given a flyer for a job at Turnkey Internet, and I was very excited to get an opportunity to work for the company. There is so much knowledge within Turnkey Internet. I love the people that I work with. It is a very personal work experience, and I have been able to learn so much from the employees. I did not go out and find myself a job; instead, I found myself a home with Turnkey Internet.

Written by admin on February 14th, 2015


Meet the Staff – Randi from the TurnKey Customer Service Team

Jan 29, 2015 @ 11:30am Staff Interviews

Randi comes to us from the snowmaking capital of the world, Hunter Mountain, NY. Randi has always had a strong fascination with electronics and how everything works. What makes it go from point A to point B? What happens if we modify the algorithm?

Unlike most children, Randi's least favorite holiday growing up was Christmas. Randi's brother always received the electronic toys: motorized cars and trucks, flying helicopters, a hoverboard. Randi was left with Barbies and play sets.

Randi's brother would get bored with one electronic toy and go play with another. At that point, Randi would take the toy into her room to try to figure out how it worked. The toys always ended up in a hundred pieces on her floor; she had to figure out how each toy worked. Never able to put them back together, she was grounded by dinner each year. Randi didn't want to be a doctor, though: no need to take Barbie apart.

Knowing Randi's strong desire to understand how things work and her love for electronics, it was only natural that her parents pushed her to attend a technical school. Randi studied at the Withlacoochee Technical Institute in Florida, then continued on to attend Mildred Elley and the Sage Colleges of Albany here in NY. Randi graduated at the top of her class with a degree in information technology.

Written by admin on January 29th, 2015


Search Engine Ranking Benefits through SEO and IP Addresses – Google Says So!

Jan 27, 2015 @ 10:45am Web hosting

Did you know that in August 2014, Google announced that HTTPS would become a ranking signal? This is news worthy of mention, as Google rarely reveals ranking criteria, and it confirms that the SEO-optimized hosting at TurnKey does in fact raise your search engine rankings (at least with Google, now confirmed).

You can read more about Google's comments here: HTTPS as a ranking signal. [PDF]

HTTPS is an added layer of encryption that places Secure Sockets Layer (SSL) on top of HTTP, or web, traffic. This adds additional security to standard HTTP communications. SSL certificates are required for e-commerce sites, especially if you want your site to be PCI compliant. You can see my post on PCI compliance here: https://blog.turnkeyinternet.net/web_hosting/pci-dss-compliance-in-the-cloud-for-web-sites-servers-and-colocation/ . Having an SSL certificate is essential on an ecommerce site because of the secure transmission of sensitive information like credit card numbers, personal information, and login accounts.

If your website or blog begins with https://, you have likely received an uptick in Google's rankings. This is currently a lightweight signal, meaning it doesn't affect your site's rankings greatly, but experts believe it will become stronger in the near future.

To turbocharge your web site's rankings, be sure to use SSL certificates with a dedicated IP address on your web site. TurnKey offers an all-in-one 'turnkey' solution to help you increase your search engine ranking: our Turbo SEO cPanel Web Hosting, which bundles multiple dedicated class-C IPs and SSL certificates for one low cost in a simple-to-use interface.

Do You Need an SSL Certificate for Your Website?

REQUIRED: All websites should have some form of protection on them. This protection can come in many forms; however, if you're going to be taking any type of data from your customers, such as credit cards, phone numbers, emails, or any personal information, you need to ensure that the data is transferred securely. SSL remains one of the most robust ways to do this.

As an online merchant, it's your responsibility to make sure your customers' private information is secure. If you are storing credit card information in a database on your website so you can manually charge it later, then you need an SSL certificate to secure the credit card data stored on your server. If you have any sort of log-in form where customers enter a username and password, then on top of sanitizing the input from the user, an SSL certificate is highly recommended.

NOT REQUIRED: An SSL certificate is optional if you don't gather personal information and instead forward your customers to a third-party payment processor like PayPal. This can be done as simply as embedding a PayPal button on your website; PayPal uses its own certificate to encrypt customers' transactions. HOWEVER, you can still benefit from SSL for search engine rankings, so it's worth the investment, just not REQUIRED for this category.

What Should Webmasters Do Now?

  1. Decide the kind of certificate you need: single, multi-domain, or wildcard. (More on this in a minute.)
  2. Use 2048-bit key certificates.
  3. Use relative URLs for resources that reside on the same secure domain, and protocol-relative URLs for all other domains.
  4. Don't block your HTTPS site from crawling using robots.txt.
  5. Allow indexing of your pages by search engines where possible by avoiding the noindex robots meta tag.
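For point 2, if you generate the certificate request yourself, OpenSSL will create a 2048-bit private key and a CSR in one standard command (swap in your own domain name):

# Generate a 2048-bit key and certificate signing request to submit to your SSL vendor
openssl req -new -newkey rsa:2048 -nodes -keyout yourdomain.com.key -out yourdomain.com.csr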

Purchase an SSL Certificate from TurnkeySSL.com

Turnkey Internet is a trusted reseller of GlobalSign SSLs. GlobalSign SSL certificates include domain validation, quick issuance, and re-issues, among many other options, such as a fully trusted bar in the browser that allows visitors to see your SSL is trusted across the web.

Turnkey Internet has multiple types of SSL certificates for secure communication with business systems, portals, mail, and more.

Our TurnKeySSL Alpha certificate is ideal for small businesses, blogs, and personal websites, and costs $29 per year.

The TurnKeySSL Professional certificate is ideal if you wish to have multiple subdomains covered (example: corp.yourdomain.com and web.yourdomain.com). This Pro-level SSL certificate has full organization vetting, which provides higher levels of trust, and includes a malware site scan service. It is also preferred for service providers and SEO companies. $150 per year.

Lastly, TurnKeySSL Extended Validation (EV) certificates are the most secure, offering visitors the green address bar and enhancing sales. They also include the malware site scan service. The green address bar that comes with a TurnKeySSL Extended Validation certificate prominently displays your company name, providing immediate trust and improving customer conversions. This certificate is $899 per year.

Keeping your certificate up to date is recommended, as you never want your clients receiving SSL warnings when purchasing a product from you. Ideally, you would set the certificate to auto-renew annually. You can always check the expiration date by clicking the padlock symbol in your browser and then "View Certificate". Test your entire checkout process in Firefox, Google Chrome, and yes, even Internet Explorer.
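You can also check the expiration date from the command line with a standard openssl one-liner (replace the domain with your own):

# Print the certificate's expiration date for a live site
echo | openssl s_client -connect yourdomain.com:443 2>/dev/null | openssl x509 -noout -enddate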

Once again, you can go directly to turnkeyssl.com to purchase any of the SSL certificate types mentioned above.

To turbocharge your web site's rankings, be sure to use SSL certificates with a dedicated IP address on your web site. TurnKey offers an all-in-one 'turnkey' solution to help you increase your search engine ranking: our Turbo SEO cPanel Web Hosting, which bundles multiple dedicated class-C IPs and SSL certificates for one low cost in a simple-to-use interface. Learn more
Until next time…

Written by Jeremy on January 27th, 2015


Net Neutrality – If It Isn’t broken, don’t fix it! Part II

Jan 22, 2015 @ 7:18am Ask the Expert

If you haven't read Part I, you can do so at https://blog.turnkeyinternet.net/ask-the-expert/net-neutrality-if-it-isnt-broken-dont-fix-it/

Let's touch on traffic prioritization. What's one of the easiest things to delay in order to prioritize other traffic? Email. After all, email doesn't need to stream all at once like a song or a movie, so it seems like the easiest thing to delay while prioritizing other traffic. Now, suppose that email is queued up all day long until, say, 3am, when traffic flow is at its lowest. At 3am, these email servers begin sending all the emails they've queued up all day long. No problem, right? Heck, it's only email. Hmm, suppose you emailed that signed contract for the proposal, and it had to be accepted by 5pm the day before, but your email got queued up in order to make way for traffic that had been prioritized over the delivery of your email. Are you beginning to see where this is going?

People have been using applications like Twitter to instantly reach multiple people at the same time. Suppose a town is under siege by a group that is trying to take it over. Suppose those townsfolk are alerting family members and friends about the invading troops' whereabouts so they can keep their loved ones out of harm's way. For the sake of argument, let's say that Twitter does not pay traffic carriers extra money to prioritize its traffic. What happens now? If your home is about to be invaded in five minutes and the message warning you of the impending invasion is not sent to you until 15 or 20 minutes later, it's too late. I know this is an extreme example, but prioritizing traffic could actually become a matter of life and death.

President Obama embraced, almost exactly, the comments I posted on the FCC website concerning this matter. In fact, what he is proposing is so close to my comments that I'm sure they must have been passed along to him. In short, my comments were to keep internet traffic neutral: nobody's traffic gets prioritized. If you are a content provider and you want your traffic prioritized, then set up your own network and allow people to buy bandwidth directly from you, so that only your content is delivered over your private bandwidth. The issue, though, is that it's not the content providers who are causing the ruckus; it is the traffic carriers who have brought this whole issue about.

Where President Obama strayed from my comments was in recommending that the internet become regulated as a utility, the way phone companies are. Let me just say this… "Dear God, save us all." I used to own a VoIP company, and the myriad of taxes, fees, and surcharges on phone service staggers the mind. Do NOT let this happen to internet services. Roughly 31%-35% of your phone bill is comprised of taxes, fees, and surcharges. The FCC says that it wants to regulate internet companies under Title II so it can control them. I guess the FCC hasn't figured out that it already regulates the internet and VoIP; no Title II regulations are needed.

I say that the content providers should not be charged just because people want to download data from them. The charge for providing end users with the bandwidth they need should come from the traffic carrier. This would be a good thing, because once the price of a traffic carrier gets too high, someone will step in with a less costly way to provide bandwidth. It's called good old fashioned competition. The traffic carriers want to make the content providers look like the bad guys. And, as if that isn't bad enough, the traffic carriers are already double-dipping on the profits. Anyone who is younger than 35, or perhaps 40, may not remember the days of completely free TV. TV was free because the networks made their money from the advertising space that they sold. Then along come cable companies that charge end users a monthly fee for TV shows AND collect all the revenues from the advertisers. Man, talk about a cash cow: they're getting paid on both sides, and now they want to charge content providers too????? Are you serious?? And remember, the content providers are already paying their hosting company for all the bandwidth they need. Why should they have to pay again? The traffic carriers have oversubscribed their networks; that is a problem solely created and owned by the traffic carriers, and the content providers should not be held responsible for it. Let me offer an analogy. Let's use a different scenario where there are three parties in the same roles:

  1. City water department – aka content provider (they provide water)
  2. Building contractor who builds houses on a huge tract of land she/he owns – aka traffic carrier
  3. End user – you and me who have bought homes from the building contractor

We purchased our homes, and the contractor guarantees us 5 gallons per minute of water flow. The contractor runs a six-inch main to serve the housing development and builds just enough homes so that the water flow to each house is 5 gallons per minute. The problem is, the building contractor wants to make more money, so they overbuild the development, and the water flow is now reduced to 2 gallons per minute simply because there are too many homes using water at any given time. So the building contractor turns around and decides to charge the city water department a fee to prioritize water flow to certain homes, which of course will reduce water flow even further to homes that do not receive prioritized water flow. The problem is not the city water department. The city water department has more than enough capacity to serve all the homes in the development with 5 gallons per minute. The problem is that the building contractor only put in a six-inch main when what is needed is a twenty-four-inch main. This is not a problem created by the home owners, nor by the city water department; the problem is that the building contractor did not put in a large enough main pipe to feed all the customers. And that is exactly what is happening with the flow of traffic on the internet. The traffic carriers (building contractor) do not have the capacity to give the end users (home owners) all the bandwidth (water flow) that they guaranteed, so the traffic carriers are turning around and charging the content providers (city water departments) a fee to prioritize their content (water flow). The building contractors (traffic carriers) want to make the reduced water flow (bandwidth) appear to be a problem caused by the city water department (content provider). And that is just not the case.

A peripheral but related issue here is the maddening amount of video and audio content displayed on websites. How many times have you gone to a website and an advertisement begins to play automatically? Also, how many times do you visit a site and want to read the story, but there is no text, just a video? That drives me nuts. All I want to do is read the story; I do not want to watch a video. Sorry, I'm going off on a tangent.

Let’s reiterate:

  1. Leave the internet as it is
  2. Do not regulate it like telephone companies are regulated unless you want to see prices increase due to fees, taxes and surcharges
  3. Leave the FCC in charge of the internet and VoIP

End of story, it truly IS that simple.

Written by Dave on January 22nd, 2015


Monitoring Your Dedicated Hosted or Cloud Hosted Servers

Earlier on the blog, I wrote to you about having backup software. I compared having backup software to having car insurance: you never know you need it until you actually need it. Does that make sense? I hope that last line wasn't too confusing. Well, I have another question for you to start this article.

 

Do you currently have any monitoring software for your server?

 

Now, depending on where you host your website or rent your server from, the host may provide a basic type of monitoring. For example, if you purchased a dedicated server, Virtual Private Server (VPS), cloud server, etc. from us at Turnkey Internet, your server will automatically be set up with basic ping monitoring. This works using ICMP, a basic protocol used across the industry to monitor servers. I won't get too off-base with this post, so you can read more about ICMP at the link below:

 

http://en.wikipedia.org/wiki/Internet_Control_Message_Protocol
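At its simplest, an ICMP check is just a scripted ping with an alert on failure. Here is a bare-bones sketch with hypothetical hostnames and addresses; a real monitoring platform adds scheduling, escalation, and history on top of this:

# Send three pings; if none come back, fire off an email alert
ping -c 3 server.example.com > /dev/null 2>&1 || echo "server unreachable" | mail -s "ALERT: ping failed" admin@example.com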

 

Now, you may be asking, "What if I bought a reseller or basic hosting account? Is that only ping monitored?" In our system, all our reseller and hosting servers have another level of monitoring attached to them. This includes ping monitoring, memory monitoring, drive space monitoring, SNMP monitoring, and bandwidth monitoring, to name just a few. We can also set up content checks. That means we can set up a monitor that checks whether a site contains a specific word or piece of text. If it doesn't find the word, the monitor will alarm for us.
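A content check is conceptually just a fetch plus a string match. Here is a minimal sketch, assuming the mail command is available on the box running the check and using placeholder names:

# Fetch the page; alarm if the expected text is missing
curl -s http://www.jeremysdomain.com/ | grep -q "Welcome" || echo "content check failed" | mail -s "ALERT: expected text missing" admin@jeremysdomain.com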

 

You may be asking: why write an article on monitoring software? Well my friends, in slaying tickets each week, I come across many different issues across different clients. Some of these issues could have been prevented, and others would have had a smaller impact if preventative measures had been taken. Let me give you an example to really drive this one home.

 

Let's say you have a website named jeremysdomain.com and purchased a dedicated server for it directly from Turnkey Internet. Your site is the life force of your business: you take orders online, you run promotions online, among other things. Next thing you know, you go to your site and it doesn't load. In fact, it just times out completely.

 

You can still ping your server, but your site is fully offline. You open a ticket with the helpdesk, and they inform you that your server is overloaded due to a large spike in bandwidth. This resulted in your server running low on memory and crashing. The engineers fix the issue and inform you that you may want to consider monitoring software that will constantly check whether your server is having issues beyond a failed ping. The entire process takes about an hour to get the server back online.

 

Now let's look at the same situation with monitoring software in place. You start a promotion on your website. As your promotion gets into full swing, you receive an email notification stating that your server is alarming on multiple items. The engineers inform you that bandwidth is beginning to max out on the server, which is causing the server to run low on memory. The engineers schedule a time with you to take the server offline and increase its memory. Your site is down only 15 minutes for the upgrade. Your promotion never skips a beat, and your customers never even notice the issue.

 

If having backup software for your server is like having insurance on your car, then monitoring software is like having a super, upgraded alarm system in your car that checks your oil level, your tire pressure, and your electrical components, among many other items.

 

Do you have monitoring software? If not, go to http://turnkeymonitoring.com/ to see some of the options available to you.

 

Until next time Turnkey Lovers…

Written by Jeremy on December 19th, 2014
