Screaming Frog Log File Analyser 3.0 - a cross-platform log file analyser
This is a cross-platform log file analyser that lets users drag and drop raw access log files straight into the interface for analysis. The Screaming Frog Log File Analyser can intelligently identify large and slow URLs and find uncrawled and orphan URLs - well worth a look if you need one!




The Screaming Frog SEO Log File Analyser allows you to upload your log files, verify search engine bots, identify crawled URLs and analyse search bot data and behaviour for invaluable SEO insight. Download for free, or purchase a licence to upload more log events and create additional projects.

What can you do with the SEO Log File Analyser?
The Log File Analyser is light, but extremely powerful – able to process, store and analyse millions of lines of log file event data in a smart database. It gathers key log file data to allow SEOs to make informed decisions.

Some of the common uses include:
Identify Crawled URLs
View and analyse exactly which URLs Googlebot & other search bots are able to crawl, when and how frequently.
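To illustrate the underlying idea (this is not the tool's implementation), a few lines of Python over hypothetical Apache combined-format log lines can tally which URLs a given bot requested. The sample lines, regex, and `crawled_urls` helper are all assumptions made for this sketch; note it matches on the user-agent string only, whereas the product can also verify bot IPs:

```python
import re
from collections import Counter

# Hypothetical Apache combined-format lines (illustration only).
LOG_LINES = [
    '66.249.66.1 - - [10/Oct/2023:13:55:36 +0000] "GET / HTTP/1.1" 200 2326 '
    '"-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [10/Oct/2023:14:01:02 +0000] "GET /blog HTTP/1.1" 200 5120 '
    '"-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.9 - - [10/Oct/2023:14:02:11 +0000] "GET /blog HTTP/1.1" 200 5120 '
    '"-" "Mozilla/5.0"',
]

LOG_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] "(?P<method>\S+) (?P<url>\S+)[^"]*" '
    r'(?P<status>\d{3}) (?P<bytes>\d+|-) "[^"]*" "(?P<agent>[^"]*)"'
)

def crawled_urls(lines, bot="Googlebot"):
    """Count how often the given bot's user agent requested each URL."""
    counts = Counter()
    for line in lines:
        m = LOG_RE.match(line)
        if m and bot in m.group("agent"):
            counts[m.group("url")] += 1
    return counts
```

Here the third (non-Googlebot) hit is ignored, leaving one Googlebot request each for `/` and `/blog`.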

Discover Crawl Frequency
Get insight into which search bots crawl most frequently, how many URLs are crawled each day and the total number of bot events.
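The same log data can be bucketed per bot per day. A minimal sketch, assuming Apache combined-format lines; the sample events and the `bot_events_per_day` helper are made up for illustration:

```python
import re
from collections import Counter

# Hypothetical sample events (dates and IPs are invented).
LOG_LINES = [
    '66.249.66.1 - - [10/Oct/2023:08:00:00 +0000] "GET / HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.2 - - [10/Oct/2023:09:30:00 +0000] "GET /a HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '157.55.39.1 - - [11/Oct/2023:10:00:00 +0000] "GET /b HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"',
]

BOTS = ("Googlebot", "bingbot")
LINE_RE = re.compile(
    r'\[(\d{2}/\w{3}/\d{4}):[^\]]+\].*"[^"]*" \d{3} (?:\d+|-) "[^"]*" "([^"]*)"'
)

def bot_events_per_day(lines):
    """Tally log events per (bot, date) pair."""
    events = Counter()
    for line in lines:
        m = LINE_RE.search(line)
        if not m:
            continue
        date, agent = m.group(1), m.group(2)
        for bot in BOTS:
            if bot in agent:
                events[(bot, date)] += 1
    return events
```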

Find Broken Links & Errors
Discover all response codes, broken links and errors that search engine bots have encountered while crawling your site.
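The idea can be sketched as filtering log events by status code. This is an illustration with invented lines, not the product's code; `errors_by_status` is a hypothetical helper:

```python
import re
from collections import defaultdict

# Hypothetical sample lines: two errors and one successful response.
LOG_LINES = [
    '66.249.66.1 - - [10/Oct/2023:13:55:36 +0000] "GET /old-page HTTP/1.1" 404 162 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [10/Oct/2023:13:56:00 +0000] "GET /api HTTP/1.1" 500 88 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [10/Oct/2023:13:57:00 +0000] "GET / HTTP/1.1" 200 2326 "-" "Googlebot/2.1"',
]

REQ_RE = re.compile(r'"(?:\S+) (\S+)[^"]*" (\d{3}) ')

def errors_by_status(lines):
    """Map each 4xx/5xx status code to the URLs that returned it."""
    errors = defaultdict(list)
    for line in lines:
        m = REQ_RE.search(line)
        if m and int(m.group(2)) >= 400:
            errors[int(m.group(2))].append(m.group(1))
    return dict(errors)
```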

Audit Redirects
Find temporary and permanent redirects encountered by search bots that might be different to those in a browser or simulated crawl.
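In log terms, this comes down to splitting 3xx responses into permanent and temporary codes. A sketch with made-up lines (the status groupings follow the HTTP status code registry; the helper is hypothetical):

```python
import re

# Hypothetical sample lines: one permanent redirect, one temporary, one 200.
LOG_LINES = [
    '66.249.66.1 - - [10/Oct/2023:12:00:00 +0000] "GET /old HTTP/1.1" 301 0 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [10/Oct/2023:12:01:00 +0000] "GET /tmp HTTP/1.1" 302 0 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [10/Oct/2023:12:02:00 +0000] "GET / HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
]

REQ_RE = re.compile(r'"(?:\S+) (\S+)[^"]*" (\d{3}) ')
PERMANENT, TEMPORARY = {301, 308}, {302, 303, 307}

def redirects(lines):
    """Bucket redirected URLs into permanent and temporary."""
    found = {"permanent": [], "temporary": []}
    for line in lines:
        m = REQ_RE.search(line)
        if not m:
            continue
        url, status = m.group(1), int(m.group(2))
        if status in PERMANENT:
            found["permanent"].append(url)
        elif status in TEMPORARY:
            found["temporary"].append(url)
    return found
```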

Improve Crawl Budget
Analyse your most and least crawled URLs & directories of the site, to identify waste and improve crawl efficiency.
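One simple way to see where crawl activity concentrates is to aggregate bot hits by top-level directory. This sketch assumes the URL paths have already been extracted from the logs; the paths and helpers are hypothetical:

```python
from collections import Counter

def section(url):
    """Top-level directory of a path: '/blog/post-1' -> '/blog/', '/about' -> '/'."""
    parts = url.split("/")
    return "/" + parts[1] + "/" if len(parts) > 2 else "/"

def crawl_budget(urls):
    """Count bot hits per top-level directory."""
    return Counter(section(u) for u in urls)

# Invented sample: paths requested by bots, one entry per log event.
BOT_HITS = ["/blog/a", "/blog/b", "/blog/c", "/shop/x", "/"]
```

Here `/blog/` would account for three of the five bot events, flagging it as the most crawled section.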

Identify Large & Slow Pages
Review the average bytes downloaded & time taken to identify large pages or performance issues.
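As a sketch of the averaging step, assume events have already been parsed into (URL, bytes, milliseconds) tuples; the numbers are invented, and a time-taken field must actually be present in the log format (for example Apache's %D or nginx's $request_time) for real data to include it:

```python
from collections import defaultdict

# Hypothetical pre-parsed events: (url, response bytes, response time in ms).
EVENTS = [
    ("/report", 1_500_000, 900),
    ("/report", 1_400_000, 1100),
    ("/", 2_000, 40),
]

def averages(events):
    """Per-URL (average bytes, average milliseconds)."""
    sums = defaultdict(lambda: [0, 0, 0])  # hits, total bytes, total millis
    for url, nbytes, ms in events:
        s = sums[url]
        s[0] += 1
        s[1] += nbytes
        s[2] += ms
    return {u: (b / n, t / n) for u, (n, b, t) in sums.items()}
```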

Find Uncrawled & Orphan Pages
Import a list of URLs and match against log file data, to identify orphan or unknown pages or URLs which Googlebot hasn't crawled.
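The matching idea in miniature, with hypothetical URL sets (one from a site crawl, one extracted from the logs), comes down to two set differences:

```python
# Invented data: URLs found by a site crawl vs. URLs seen in the access logs.
crawl_urls = {"/", "/blog", "/about", "/contact"}
log_urls = {"/", "/blog", "/legacy-page"}

uncrawled_by_bots = crawl_urls - log_urls  # known pages bots never requested
orphans = log_urls - crawl_urls            # in the logs, missing from the crawl
```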

Combine & Compare Any Data
Import and match any data with a 'URLs' column against log file data. So import crawls, directives, or external link data for advanced analysis.
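A minimal join on a shared 'URL' key (hypothetical data) mirrors the idea of matching an imported crawl against log events; this is an illustration, not the product's own join logic:

```python
# Invented crawl export rows and per-URL bot-hit counts from the logs.
crawl_data = [
    {"URL": "/", "indexable": True},
    {"URL": "/old", "indexable": False},
]
log_hits = {"/": 42, "/old": 7, "/ghost": 3}  # URL -> bot events in the logs

# Attach the log-derived hit count to each crawl row (0 if never seen).
combined = [dict(row, bot_hits=log_hits.get(row["URL"], 0)) for row in crawl_data]
```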
