DNS Monitoring: How to Check Your Traffic for Threats

Cybercriminals are becoming more sophisticated in their attacks.

The Domain Name System (DNS) is a website’s identity on the internet: it maps your domain name to the servers visitors actually reach, making it a core component of your security architecture.

Unless your website has appropriate DNS monitoring in place, you could easily become a cybercriminal’s next victim.

Below, we offer practical tips on how to detect and prevent DNS security threats.

Why Do Cyber Criminals Target DNS?

Unfortunately, cybercriminals will target any vulnerable internet service or protocol, and a website’s DNS is no exception.

They can then register disposable domain names for a spam campaign or botnet administration.

What’s more, an attacker could use the domains to host malware or phishing downloads.

Malicious queries can also exploit a nameserver or disrupt name resolution.

Sadly, these attacks can destroy a website’s performance, functionality, and reputation.

The servers of Dyn are a perfect example.

The company operates a significant portion of the internet’s DNS infrastructure. On October 21st, 2016, it suffered a DDoS attack that brought down much of the internet across America and Europe.

The attack, launched from the Mirai botnet, has been classed as one of the largest of its kind in history.

A variety of high-profile websites experienced downtime, including Twitter, The Guardian, CNN, Netflix, and Reddit.

While it may be impossible to prevent every potential DNS threat, it’s essential to take action so your website doesn’t fall victim to a cyber attack.

Why DNS Monitoring?

More than a quarter of companies haven’t established responsibility for their DNS security, despite the fact that DNS attacks have increased by more than 200%.

To prevent your website from becoming a cyber attack target, you must carry out regular DNS monitoring.

A DNS log records every DNS lookup exchanged between your site and visiting devices.

To maintain website security, it’s essential to monitor the DNS traffic between each device and your local recursive resolver.

Forensic analysis of this traffic allows you to:

  • Identify the websites visited by an employee
  • Discover malware and botnets connecting to their command-and-control (C&C) servers
  • Detect a DDoS attack
  • Pinpoint Domain Generation Algorithm (DGA) and other malicious domains being accessed
  • Identify the dynamic domains being accessed

When analyzing the DNS log, it’s essential to verify each domain against DGA and malicious-domain databases.
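To make that concrete, here is a minimal Python sketch of the check, assuming a plain-text log with one queried domain per line and a blocklist file (both file names are hypothetical). The entropy heuristic for spotting DGA-style names is illustrative, not a production-grade detector.

```python
import math
from collections import Counter

def shannon_entropy(label: str) -> float:
    """Character-level entropy; DGA-generated labels tend to score high."""
    counts = Counter(label)
    total = len(label)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# Hypothetical inputs: one domain per line in each file.
with open("blocklist.txt") as f:
    blocklist = {line.strip().lower() for line in f if line.strip()}

with open("dns_queries.log") as f:
    for line in f:
        domain = line.strip().lower()
        if not domain:
            continue
        if domain in blocklist:
            print(f"BLOCKLISTED: {domain}")
            continue
        label = domain.split(".")[0]  # leftmost label, e.g. "xkqjzw" in xkqjzw.example.com
        if len(label) >= 12 and shannon_entropy(label) > 3.5:  # thresholds are illustrative
            print(f"POSSIBLE DGA: {domain}")
```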

If you’re unsure of where to start with DNS monitoring, here are six security measures to help you proactively protect your website.

1. Firewalls

Firewalls have the potential to expose DNS threats, so they’re an effective tool for DNS monitoring.

Most firewalls will allow webmasters to define rules to prevent IP spoofing.

For example, you could enter a rule that denies DNS queries from IP addresses outside an allocated number space. This could prevent your nameserver from being exploited in a DDoS attack.
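The exact syntax depends on your firewall, but the underlying logic is simple enough to sketch in Python. Here, the allocated number space is a stand-in documentation range, not a recommendation.

```python
import ipaddress

# Hypothetical allocated number space for your network (TEST-NET-3, for illustration).
ALLOWED_NET = ipaddress.ip_network("203.0.113.0/24")

def should_answer(source_ip: str) -> bool:
    """Mirrors a firewall rule: deny DNS queries from outside our space."""
    return ipaddress.ip_address(source_ip) in ALLOWED_NET

print(should_answer("203.0.113.45"))  # True: inside the allocated range
print(should_answer("198.51.100.7"))  # False: deny, possibly a spoofed source
```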

It’s also beneficial to enable DNS traffic inspection for suspicious byte patterns or irregular DNS traffic, so you can take steps to block attacks that exploit nameserver software.

2. Traffic Analyzers

One of the best ways to identify harmful malware traffic is passive traffic analysis.

A traffic analyzer will allow you to both capture and filter DNS traffic between a device and your local recursive resolver, which you can then save to a PCAP file.

Webmasters can then write scripts that search the PCAP file for specific suspicious activity.
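As a starting point, here is a short sketch using the scapy library (our choice for illustration; tshark or tcpdump filters work just as well) to pull queried names out of a hypothetical capture file.

```python
# Requires scapy: pip install scapy
from scapy.all import rdpcap
from scapy.layers.dns import DNS, DNSQR

packets = rdpcap("dns_capture.pcap")  # hypothetical PCAP saved by your analyzer

for pkt in packets:
    # qr == 0 marks a query (qr == 1 would be a response).
    if pkt.haslayer(DNS) and pkt[DNS].qr == 0 and pkt.haslayer(DNSQR):
        qname = pkt[DNSQR].qname.decode(errors="replace").rstrip(".")
        print(qname)
```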

3. Passive DNS Replication

Passive DNS replication allows a webmaster to deploy sensors at local recursive resolvers.

This creates a database containing each DNS transaction, such as the query or response, through a resolver or set of resolvers.

The replication can be instrumental in identifying one or more malware domains, particularly in cases when the malware uses algorithmically generated domains (AGDs).
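Conceptually, a passive DNS store is just a table of observed answer tuples with first-seen and last-seen timestamps. Below is a minimal in-memory Python sketch; a real deployment would persist this to a database.

```python
import time

# Maps (qname, rrtype, rdata) -> observation metadata.
pdns: dict[tuple[str, str, str], dict] = {}

def record(qname: str, rrtype: str, rdata: str) -> None:
    """Record one observed DNS answer with first/last-seen timestamps."""
    key = (qname.lower().rstrip("."), rrtype, rdata)
    now = time.time()
    entry = pdns.setdefault(key, {"first_seen": now, "last_seen": now, "count": 0})
    entry["last_seen"] = now
    entry["count"] += 1

record("example.com", "A", "93.184.216.34")
record("example.com", "A", "93.184.216.34")

# Names observed only once and never again are typical of DGA churn.
one_offs = [key for key, meta in pdns.items() if meta["count"] == 1]
print(one_offs)
```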

4. Intrusion Detection Systems

An effective intrusion detection system lets you create rules that report on DNS requests from unauthorized networks.

It is beneficial to write rules that count or report the following (a counting sketch follows the list):

  • NXDOMAIN responses
  • DNS queries over TCP
  • Responses containing resource records with short TTLs
  • Unusually large DNS responses
  • DNS queries to non-standard ports
  • …and other anomalous patterns
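A dedicated IDS rule language (Snort or Suricata, for example) is the usual home for these checks. Purely to illustrate the counting logic, here is a Python sketch over hypothetical parsed DNS events; the thresholds are illustrative.

```python
from collections import Counter

# Hypothetical parsed DNS events produced by your capture pipeline.
events = [
    {"rcode": "NXDOMAIN", "proto": "udp", "size": 98,   "min_ttl": 30,   "dst_port": 53},
    {"rcode": "NOERROR",  "proto": "tcp", "size": 1800, "min_ttl": 5,    "dst_port": 53},
    {"rcode": "NOERROR",  "proto": "udp", "size": 120,  "min_ttl": 3600, "dst_port": 5353},
]

tallies = Counter()
for ev in events:
    if ev["rcode"] == "NXDOMAIN":
        tallies["nxdomain_response"] += 1
    if ev["proto"] == "tcp":
        tallies["query_over_tcp"] += 1
    if ev["min_ttl"] < 60:        # "short TTL" cutoff is illustrative
        tallies["short_ttl"] += 1
    if ev["size"] > 1500:         # "unusually large" cutoff is illustrative
        tallies["large_response"] += 1
    if ev["dst_port"] != 53:
        tallies["non_standard_port"] += 1

print(tallies)
```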

All DNS queries should be carefully reviewed.

Intrusion detection systems can also be integrated into firewalls, allowing you to create deny or permit rules for many of the checks listed above.

5. DNS Monitoring with Local Resolver Logs

Your local resolver logs are probably the most obvious and essential place to start with DNS monitoring.

By enabling resolver logging, you can use tools such as OSSEC to collect DNS server logs and check them against known malicious domains.
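If you prefer to script the check yourself, the sketch below scans BIND-style query-log lines for lookups of blocklisted names. The regex and file path are assumptions; log formats vary by resolver and version.

```python
import re

# Matches BIND-style lines such as:
#   client 10.0.0.1#53214 (evil.example): query: evil.example IN A + (10.0.0.2)
LINE_RE = re.compile(r"client (?:@\S+ )?(?P<ip>[\d.]+)#\d+ \((?P<qname>[^)]+)\):")

blocklist = {"evil.example", "c2.badsite.test"}  # hypothetical malicious domains

with open("query.log") as f:  # hypothetical resolver log path
    for line in f:
        m = LINE_RE.search(line)
        if m and m.group("qname").lower().rstrip(".") in blocklist:
            print(f"{m.group('ip')} looked up {m.group('qname')}")
```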

6. A Secure Registrar

Most websites are registered via a registrar company.

Unfortunately, if a cyber-attacker can compromise the account with the registrar, they can gain control over your domain name.

This means they can point your domain to servers of their choosing, including their own nameservers.

What’s more, they can transfer the domain to either a new owner or an offshore registrar – which means you might be unable to recover the domain.

Intelligent attackers may target an account’s password, or they may even go after the registrar’s tech support directly.

You’ll want to avoid registrar hijacking, so you should select a registrar that provides heightened security precautions.

Look for services like multi-factor authentication.

Suspicious Signs to Analyze

It is important to pay close attention to any potential signs of malicious activity on your network.

We recommend analyzing the composition and length of DNS responses, which can help to identify malicious intent.

If response messages are unusually large, this could indicate an amplification attack.

You should also review the answer and additional sections of response messages for unexpected records, which could be a sign of cache poisoning.
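These checks are easy to automate once responses are parsed. The sketch below assumes you already extract each response’s size and section counts; both thresholds are purely illustrative and should be tuned to your traffic’s baseline.

```python
SIZE_THRESHOLD = 1500        # bytes; illustrative
SECTION_THRESHOLD = 10       # resource records; illustrative

def flag_response(size: int, answer_count: int, additional_count: int) -> list[str]:
    """Return human-readable flags for one parsed DNS response."""
    flags = []
    if size > SIZE_THRESHOLD:
        flags.append("unusually large response (possible amplification)")
    if answer_count > SECTION_THRESHOLD or additional_count > SECTION_THRESHOLD:
        flags.append("bloated answer/additional sections (possible cache poisoning)")
    return flags

print(flag_response(size=2400, answer_count=2, additional_count=25))
```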

Conclusion

The biggest risk to a website is ignorance, which will not be bliss when you suffer a cyber attack.

There are various forms of DNS monitoring that will allow you to expose threats and keep your website secure.

It’s up to you, as the website admin, to determine the right strategy for detecting suspicious or malicious activity on your network.

While DNS monitoring doesn’t sound like a fun thing to do, it is essential for the security of your website.

Ensure you take the necessary steps to stop a cyber criminal in their tracks.

Why System Downtime and Slow Speeds Affect SEO

The world today is experiencing an increased demand for immediate gratification. People expect instant access to information, and the reasons aren’t hard to pinpoint.

Ever-increasing Internet speeds are giving people access to information at an unprecedented rate. Between 2011 and 2014 alone, average Internet speeds increased by 10 Mbps.

To put that in perspective, in 2000 a mere 200kbps met the FCC’s definition of advanced Internet services.

Couple this with the 207 million smartphone users in the US, and it makes sense that attention spans are decreasing.

A study by Microsoft Corp. helps bring to light just how short our attention spans have become. People generally can’t focus for more than eight seconds. That’s one second shorter than a goldfish’s attention span.

This impatience also affects website load times. After all, what’s the point of fast internet access if web pages load slowly?

An astounding 47 percent of users expect a web page to load in two seconds or less, and that number will surely rise.

Google long ago took note of this trend and incorporated site speed and system downtime into their algorithm.

Keeping pace in modern SEO means keeping pace with shortening attention spans and user demands for speed.

System Downtime

System downtime is never positive. Yes, site maintenance and other small issues force websites down occasionally, but Google remembers even the smallest amount of system downtime.

Understanding why Google punishes websites for their downtime is the key to minimizing its impact on SEO. We’ve broken down Google’s rationale into three main categories.

Google Loves Crawling

Google indexes your website with its “spider” tool that “crawls” your web pages. Put another way, Google checks your website for new content and backend updates.

When a website greets Google with an error code, for instance a 500 internal server error or a 503 response, Google registers your system downtime and adjusts your SEO rank accordingly.

In general, the longer your site throws an error code, the more Google will penalize your ranking. However, some error codes hurt more than others.

A Moz study found that intermittent 500 internal server errors caused keywords to drop out of both the top ten and top 20 rankings. The pages in question also received fewer crawls per day. Fewer crawls mean fewer opportunities for Google to record SEO signals, and therefore worse SEO potential.

The 500 internal server error was also found to wreak havoc during sustained downtime. Domains dropped anywhere from 5 to 100 positions for tracked keywords.

The User Knows Best

Google is increasingly focused on providing users with the best experience possible. This has continued with the recent release of Penguin 4.0.

If your site is dropping keywords and other backend metrics, Google interprets that as a sign your website isn’t user-friendly. Inconvenient websites earn lower rankings.

Former Google employee Matt Cutts elaborated on the impact downtime has on user experience during a Google Q&A session.

Cutts said, “If your host is down for two weeks…there’s a better indicator that your website is actually down, and we don’t want to send users to a website that’s actually down.”

Keeping your website active is crucial to providing a positive user experience and winning Google’s favor.

If your website must go down, make sure to issue a 503 error. The status code tells Googlebot and users that the downtime is temporary. Warning Google this way allows it to hold off on reducing your search rank.
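As an example, here is a minimal sketch using Flask (the framework choice is ours, not a requirement) that returns a 503 with a Retry-After header while a maintenance flag is set.

```python
# Requires Flask: pip install flask
from flask import Flask

app = Flask(__name__)
MAINTENANCE = True  # flip this on while the site is down for maintenance

@app.before_request
def maintenance_gate():
    if MAINTENANCE:
        # 503 tells Googlebot the outage is temporary; Retry-After hints
        # (in seconds) when it should come back and crawl again.
        return "Down for maintenance", 503, {"Retry-After": "3600"}
```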

Site Speed

Site speed is an almost entirely user-based metric. While Google factors things like keywords and links into relevance and other SEO signals, site speed matters mainly because of its effect on the end user’s experience.

We’ve already touched on how impatient users are, but that impatience is critical to understanding site speed. In fact, the site speed metric exists because of user impatience.

The term site speed refers to how quickly a web page loads. Several metrics measure actual loading time (a simple measurement sketch follows the list):

  • Document complete time measures how quickly a web page becomes interactive.
  • Fully rendered refers to when your web page has fully loaded, including advertisements and all background elements.
  • Time to First Byte (TTFB) refers to how long your browser takes to receive the first byte of a response from a web server after requesting a URL.
  • Page size is the total number of bytes that make up your page, weighed against how long the page takes to fully render.
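As a quick illustration of the third metric, Time to First Byte can be approximated in a few lines of Python with the requests library; example.com stands in for your own URL.

```python
# Requires requests: pip install requests
import requests

url = "https://example.com/"  # substitute your own page
resp = requests.get(url, stream=True)  # stream=True stops before the body downloads

# resp.elapsed spans sending the request to parsing the response headers,
# which makes it a rough TTFB proxy.
print(f"TTFB ~ {resp.elapsed.total_seconds() * 1000:.0f} ms")
resp.close()
```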

How to Optimize Your Speed

The goal of increasing site speed is tailoring your website to capitalize on each metric. For instance, page size is an easy metric to optimize for: decreasing your overall page size will usually decrease your time to full render.

Other metrics are harder to capitalize on. All load-time metrics can benefit from a better host, but better hosting costs money. Likewise, optimal HTML structure and web compression are hard for the average user to implement, but both reliably improve site speed.
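To see what compression buys you, this small sketch gzips a made-up HTML payload and compares byte counts; real savings depend on your actual markup.

```python
import gzip

html = ("<html><body>" + "<p>Lorem ipsum dolor sit amet.</p>" * 200 +
        "</body></html>").encode()
compressed = gzip.compress(html)

print(f"raw: {len(html)} bytes, gzipped: {len(compressed)} bytes "
      f"({100 * len(compressed) / len(html):.0f}% of original)")
```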

Other, easier methods for increasing site speed include avoiding Flash and reducing your image sizes. It’s important to optimize for every metric possible when seconds matter.

Wrapping It All Up

Site speed and system downtime are two important metrics that Google uses to determine SEO rank. Each has an effect at the algorithm level, but it’s important to remember that Google’s focus is ultimately on end user experience.

Anything about your website that hinders user experience is likely to hurt your SEO.

Ask yourself this: are there any elements of your website that you dislike or that make it difficult to use? Does your website load slowly? Is it always going offline?

If your answer is yes, Google has already penalized your search ranking. Our product makes sure that website downtime no longer affects your search rank.

We understand that time is of the essence, and provide users with real-time website monitoring and instant alerts to any errors or issues. We also offer a custom API for deep integration into website diagnostic data.

Our company’s goal is to keep your customers engaged by keeping your website online.

If you have any further questions about our product, please contact us here. We’re always eager to help new customers.