What is SEO Log File Analysis? A Beginner's Guide
Performing regular log file analysis helps SEO professionals better understand how search engines crawl their website. Learn the basics here.
Log File Auditing Tools & Their Differences | Lesson 20/34 | SEMrush Academy
Log files contain the history of every person or crawler that has accessed your website. You will learn how to deal with log files.
Watch the full course for free: https://bit.ly/3gNNZdu
1:01 Log File Analyzer
1:39 Elastic Stack
2:15 Other solutions
✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹
You might find it useful:
Understand how Google bots interact with your website by using the Log File Analyzer:
➠ https://bit.ly/3cs0rfC
Learn how to use SEMrush Site Audit in our free course:
➠ https://bit.ly/2Xsb3XT
✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹
Once you have your log files ready, it is time to start working with the data. There are different ways to approach this, but relying on a simple text editor is not one of them: log files are often hundreds of megabytes in size, and a text editor will struggle to open a file that large.
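To make the point concrete, you never need to load the whole file at once: a few lines of Python can stream a multi-hundred-megabyte log line by line. This is a minimal sketch; the filename and the plain substring match on "Googlebot" are illustrative assumptions, not part of the course.

```python
# Count Googlebot hits in a large access log without loading it into memory.
# "access.log" and the simple substring match are illustrative assumptions.
def count_googlebot_hits(path="access.log"):
    hits = 0
    with open(path, encoding="utf-8", errors="replace") as f:
        for line in f:  # the file object streams one line at a time
            if "Googlebot" in line:
                hits += 1
    return hits
```

Because the file is read lazily, memory use stays flat no matter how large the log grows.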
For a small site, you can start with DIY solutions based on Excel or even Google Docs. You’d have to build filtering, cross-references, and so on manually, so it doesn’t scale. There are no ready-made dashboards or graphs – you’d need to build those first, which is clearly not the simplest way to approach this.
One of the better ways – especially from an SEO perspective – is the Screaming Frog Log File Analyser. It is a beginner-level, desktop-based log file auditing tool with some very useful predefined reports. It has a simple interface where you can drill down into different reports, understand crawl events and behaviour, see response codes, and so on.
However, it has no sharing capability, and you need to download log files manually from the server and import them into the tool – with large files, this can take forever. It is best suited to small and medium-sized sites.
Another solution is the Elastic Stack – formerly known as the ELK Stack. It consists of three tools:
Elasticsearch: search & analytics engine,
Logstash: server-side data processing pipeline,
Kibana: data visualisation (charts, graphs, etc.)
The great thing about the Elastic Stack is that it’s open source and therefore free to use. On the other hand, you have to set it up and run it on your own server infrastructure, so it requires IT resources.
There are also SaaS solutions such as logrunner.io, logz.io and Loggly. They are all based on the ELK stack but focused on SEO auditing, so they offer dashboards where you can, for example, see crawl behaviour over time or response codes per crawler.
The beauty of SaaS solutions is that they work in near real time: you pipe your log files into the system, and very quickly you can see what’s happening on your website.
It is important to integrate log file analysis into your regular SEO workflow rather than treating it as a one-off audit. One-off audits are fine to start with, but log file audits become truly invaluable when you combine them with web crawl data and run them on an ongoing basis. Besides, messing around with exports, uploads and downloads is frustrating.
I’d generally recommend finding something that fits your requirements. All of these tools have limitations, and pricing is usually based on the volume of data they process per month. The advantage of SaaS solutions is the ability to share reports with a team or a client. If you’re doing a migration, they make things easier because you can see all the events as they happen, in real time.
#TechnicalSEO #TechnicalSEOcourse #LogFileAnalysisTools #SEMrushAcademy
How to Do Log File Analysis | Lesson 4/7 | SEMrush Academy
In this lesson, we’ll use the Log File Analyzer to gain in-depth insight into how Googlebot crawls your website.
Watch the full course for free: https://bit.ly/3d7lPIu
0:16 Log File Analyzer
0:51 Googlebot Activity graph
2:41 Summary
✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹
You might find it useful:
Apply your newly acquired knowledge by practicing with SEMrush and build up your real-world skills.
Go to Log File Analyzer:
➠ https://bit.ly/2LYBFJn
Get a comprehensive and detailed overview of technical SEO in our free course:
➠ https://bit.ly/2ZCU9Hp
✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹
After probing your website’s crawlability with the help of the Crawlability report within the Site Audit tool, it’s high time to examine the access logs of your web server.
In this lesson, we’ll use the Log File Analyzer tool to gain in-depth insight into how Googlebot crawls your website.
Log File Analyzer
First, you need to download the log files from your website. In most cases, this can be done via an FTP client. Once you’ve acquired your log files, make sure that they are in the proper format – it must be the Combined Log Format – and that each file doesn’t exceed 1 GB. Then, go to the Log File Analyzer page and upload them. After a short while, your files will be uploaded and the report will be ready.
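If you want to sanity-check that a file really is in Combined Log Format before uploading, a short script can try to parse each line. This is a hedged sketch based on the standard Apache Combined Log Format field layout; it is not part of the SEMrush tool.

```python
import re

# Combined Log Format:
# host ident authuser [date] "request" status bytes "referer" "user-agent"
LOG_RE = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<size>\S+) '
    r'"(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

def parse_line(line):
    """Return a dict of fields, or None if the line is not Combined Log Format."""
    m = LOG_RE.match(line)
    return m.groupdict() if m else None
```

A file where `parse_line` returns None for many lines is probably in Common Log Format (no referer/user-agent fields) or some custom format, and would need reconfiguring on the server side first.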
At the top of the report, you can see the Googlebot activity graph. By default, it shows how many Googlebot hits your website received daily over the entire logged period. You can change the view by selecting a specific Googlebot and a period of time in the dropdown lists above. Then, use the selector to see the distribution of the crawled pages’ status codes and file types over the chosen time frame.
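The daily-activity view the graph shows can be reproduced from raw log timestamps. As a minimal sketch (assuming timestamps in the Combined Log Format’s default style, e.g. `10/Oct/2000:13:55:36 -0700`):

```python
from collections import Counter
from datetime import datetime

def daily_hits(timestamps):
    """Count hits per calendar day from Combined Log Format timestamps,
    e.g. '10/Oct/2000:13:55:36 -0700'. Returns {'YYYY-MM-DD': count}."""
    per_day = Counter()
    for ts in timestamps:
        day = datetime.strptime(ts, "%d/%b/%Y:%H:%M:%S %z").date()
        per_day[day.isoformat()] += 1
    return dict(per_day)
```

Feeding this only the lines whose user agent is a (verified) Googlebot gives you the same kind of per-day activity series the graph plots.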
Below, you’ll see a list of your website’s crawled files and folders – it’s sorted by the number of bot hits by default. For each file, you can view and sort by file type, share of hits, crawl frequency, and last crawl date.
To refine the list and see only specific elements, apply one of the following filters:
By path – if you need to find a particular file or folder and know its name
Last status code recorded – if you need to see which pages have, say, errors or redirections, and
File type
You can combine these filters as you like.
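The same combination of filters is easy to sketch over parsed log records. In this hedged example, each record is assumed to be a dict with `path` and `status` keys (as produced by a parser like the one above); the function names are illustrative, not the tool’s API.

```python
def filter_records(records, path=None, status=None, file_type=None):
    """Filter parsed log records; any combination of criteria may be given.
    Each record is assumed to be a dict with 'path' and 'status' keys."""
    out = []
    for r in records:
        if path is not None and path not in r["path"]:
            continue  # path filter: substring match on the request path
        if status is not None and r["status"] != status:
            continue  # last-status-code filter
        if file_type is not None and not r["path"].split("?")[0].endswith("." + file_type):
            continue  # file-type filter, ignoring any query string
        out.append(r)
    return out
```

Because each criterion is an independent check, combining filters is just passing more than one argument.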
All this data provides you with valuable insights that help you:
Get rid of structural and navigational problems that affect the accessibility of your pages.
Track the occurrence of technical problems (broken pages, incorrect redirects, etc.) over a chosen time frame.
Optimize your crawl budget by finding areas of ineffective spending.
See how Googlebot prioritizes content, and build your SEO strategy by drawing bots’ attention to your most important pages.
Plan your content production wisely, synchronizing your content creation plan with when that content actually appears in the SERPs.
Verify that a site migration, if one took place, went through without losses.
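To make the crawl-budget point concrete, one common first step is to aggregate bot hits by site section and look for sections that soak up crawls without SEO value. A minimal sketch (the grouping by first path segment is an illustrative assumption; real sites may need deeper grouping):

```python
from collections import Counter

def hits_per_section(paths):
    """Count bot hits per top-level path segment to spot sections
    consuming crawl budget. Input is a list of request paths."""
    counts = Counter()
    for p in paths:
        segment = p.lstrip("/").split("/", 1)[0].split("?")[0] or "(root)"
        counts[segment] += 1
    return counts
```

If, say, a faceted-search or calendar section dominates the counts while key landing pages are rarely hit, that is a classic sign of ineffective crawl-budget spending.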
You can upload another log file at any time. If it’s an up-to-date version of an already uploaded file, the tool will automatically update the report accordingly.
#TechnicalSEO #TechnicalSEOcourse #SEOlogFile #LogFileAnalysis #SEMrushAcademy
Server Log Files & Technical SEO Audits: What You Need to Know Presented by Samuel Scott
Server log files contain the only data that is 100% accurate in terms of how Google and other search engines crawl your website. Sam will show you what and where to check, and what problems you may need to fix to maximize your rankings and organic traffic.
What Is Log File Analysis for SEO
Prosperity Media’s Dejan Mladenovski shares his insights on What Is Log File Analysis for SEO.