Netpeak Spider 3.1: New Generation of SEO Reports



Hello everyone! Today I want to tell you about all the updates in Netpeak Spider 3.1. We have added several super useful reports and, of course, improved the interface of the program. I can't wait to show you all the updates, so let's open the new Netpeak Spider and get started.

If you want to make an in-depth analysis of your or your competitors' websites in the context of segments in the URL structure, then our new 'Site structure' report is the thing to try. The exported file will contain all the parameters selected in the 'Parameters' tab of the sidebar, but the most important part of this spreadsheet is the 'Section' column. It works like a numbered list in a document and makes it easy to sort the rows and quickly determine the number and uniqueness of the segments in a URL. We have also added this column to the extended copy feature, which is available in the 'Site structure' report in the sidebar. If you see a cell marked yellow in the 'URL' column, it means that this section doesn't have a main page: for example, if you have the page example.com/tags/cats but no example.com/tags, then example.com/tags will be marked yellow. I understand that it's not a critical issue, but it's better to be informed about such things in your site structure so you can take appropriate action.

Note that in the 'Overview' and 'Site structure' tabs, all hosts are sorted in two steps: first by the number of dots (fewer dots mean higher in the list), and then in alphabetical order. All the subsequent segments are sorted only in alphabetical order.
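To make that ordering concrete, here is a minimal Python sketch of the two-step host sorting described above (the host names are hypothetical, and this is an illustration, not Netpeak Spider's actual code):

```python
# Hypothetical hosts found during a crawl.
hosts = ["blog.example.com", "example.com",
         "shop.example.com", "api.blog.example.com"]

# Step 1: fewer dots ranks higher; step 2: alphabetical order.
for host in sorted(hosts, key=lambda h: (h.count("."), h)):
    print(host)
# example.com
# blog.example.com
# shop.example.com
# api.blog.example.com
```

The tuple key encodes exactly the two steps: the dot count dominates, and the alphabetical comparison only breaks ties.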
But the most interesting thing about the 'Site structure' report is the opportunity to visualize the data, and I will show you how to do it with the XMind program. All we need is to select the necessary part of the structure (I will choose the whole structure), copy it, and then paste it without formatting into the program. Let me erase all the data from this map and then paste without formatting. Thus you will have a mind map of all the links on your website, and it's also possible to separate the internal linking from all the external links and subdomains.
Ever since I started working at Netpeak, the most frequent user request has been a report about redirects: one that shows the page containing the link leading to a redirect, where this link initially led, its anchor, and also the status code and the final redirect URL. I'm glad to say that this report is now available in our software, with some additional information, so you can send it as a task to your developers or content managers to get rid of all unnecessary redirects on your website. A redirect by itself is not a critical problem, but it's much better to lead visitors and search robots right to the target page without any time losses. It's also crucial for saving your crawl budget.
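As a rough illustration of what such a report holds, here is a hedged Python sketch using the requests library; it only checks target URLs, while the report described above also includes the source page and link anchor taken from the crawl data:

```python
import requests

def redirect_report(urls):
    """Simplified redirect report: status code of the first hop,
    number of hops, and the final redirect target for each URL."""
    rows = []
    for url in urls:
        resp = requests.get(url, allow_redirects=True, timeout=10)
        if resp.history:  # non-empty when at least one redirect happened
            rows.append({
                "url": url,
                "status_code": resp.history[0].status_code,
                "redirects_count": len(resp.history),
                "final_url": resp.url,
            })
    return rows

# Hypothetical usage:
# print(redirect_report(["http://example.com/old-page"]))
```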
One of the most popular Netpeak Spider use cases is scraping different kinds of data from websites: people most often scrape contact details, prices, reviews, or even product characteristics, and the ways you can use scraping are limited only by your imagination. That's why we needed to improve exporting this data to make further use of these reports easier for our customers. That's how the 'Scraping summary in a single file' and 'Data for scraping conditions in separate files' reports were born. Let's take a deeper look at the 'Scraping summary in a single file' report using the example of scraping Amazon's product pages. As you can see, each URL corresponds to only one row of the table, and the row contains the data scraped from it. There is one more advantage: when you export this report in XLSX format, you will see small notes in the column headers, and if you hover over them, you will see the condition that was used for scraping this data.
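Conceptually, the layout is one row per URL and one column per scraping condition. Here is a minimal sketch with requests and BeautifulSoup, where the CSS selectors are made up for illustration and are not Amazon's real markup:

```python
import requests
from bs4 import BeautifulSoup

# Hypothetical scraping conditions: column name -> CSS selector.
CONDITIONS = {
    "title":  "h1.product-title",
    "price":  "span.price",
    "rating": "span.rating",
}

def scrape_summary(urls):
    """One row per URL, one column per scraping condition."""
    rows = []
    for url in urls:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        row = {"url": url}
        for name, selector in CONDITIONS.items():
            node = soup.select_one(selector)
            row[name] = node.get_text(strip=True) if node else None
        rows.append(row)
    return rows
```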
If you've got a task to quickly get a complete list of URLs of your or your competitors' websites, Netpeak Spider 3.1 will help you in just a few clicks. The 'Pending URLs' report will contain a list of all found web pages and their depth from the initial URL, even before the crawling is complete. By the way, you can configure settings like crawling external links or subdomains, and they will also be considered in this report.
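For intuition, the 'depth' in this report is the number of clicks from the initial URL, which a breadth-first crawler tracks naturally alongside its queue of pending URLs. A toy sketch under that assumption (link extraction is left to a caller-supplied function):

```python
from collections import deque

def crawl_with_depth(start_url, get_links, max_depth=3):
    """Toy breadth-first crawl: 'pending' is the pending-URLs list at any
    moment, and 'seen' maps each URL to its depth from the initial URL."""
    seen = {start_url: 0}
    pending = deque([start_url])
    while pending:
        url = pending.popleft()
        depth = seen[url]
        if depth == max_depth:
            continue
        for link in get_links(url):  # get_links(url) -> iterable of URLs
            if link not in seen:
                seen[link] = depth + 1
                pending.append(link)
    return seen
```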
Let's go deeper into each item of the new reports export menu. The first item is 'Current table results', which performs the same function as the 'Export' button at the top of the main program tables: you get a report with the current table results or a dashboard. Then we have 'Main reports set': it's a bulk export that gives you a folder which includes all issues, the 'All results' table, site structure, scraping, overview, and all the unique URLs and anchors. 'All issues' is also a bulk export, and it gives you all the reports about issues that have been found during the crawl; if you don't want to get them all at once, you can choose the necessary ones from the 'Issue reports' group. I have already told you about the 'Site structure' and 'Pending URLs' reports, so if you missed them, just rewind this video. Then we have 'Extra-large reports from database': documents from these reports can be really heavy, because they contain scraping results and also information about the links. And the last item here is 'All available reports', the heaviest type of bulk export, because it gives you all the available reports except the current table results.
Then I want to highlight some reports that are usually overlooked. They are placed in 'Extra-large reports from database', and they are all about unique URLs and anchors. The first one is 'Unique URLs', where you will find the list of unique URLs, the number of times they appear on the website, and also the number of anchors used for these links. It will give you more information about the number of links to external services or how many different anchors are used for these links. The second one is 'Unique anchors': all the data here is grouped by the anchor text, and we have also added the number of pages where these anchor texts have been used. So this report will help you with a more in-depth analysis of the internal linking and also the anchor list used for a certain group of pages.
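If you were to rebuild such a grouping yourself from a raw link export, it is essentially a group-by over anchor text. A pandas sketch with made-up column names:

```python
import pandas as pd

# Hypothetical raw link data: one row per internal link.
links = pd.DataFrame({
    "source_page": ["/a", "/b", "/c", "/a"],
    "target_url":  ["/x", "/x", "/y", "/y"],
    "anchor":      ["buy now", "buy now", "details", "details"],
})

# 'Unique anchors': group by anchor text, count links and distinct pages.
unique_anchors = links.groupby("anchor").agg(
    links_count=("target_url", "size"),
    pages_count=("source_page", "nunique"),
).reset_index()
print(unique_anchors)
```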
If you are not sure whether you have enough time to grab a burrito nearby, Netpeak Spider will help: in the latest update we have implemented a crawling duration forecast with an easy algorithm. We take the number of pending URLs and divide it by the crawling speed, so it sounds like a simple school math problem: a train leaves the 'initial URL' station at a given crawling speed; at what time will the SEO specialist see his reports completed if there are 10,000 pages left? The expected time left is, of course, shown in the status bar.
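The forecast itself is plain arithmetic, as a quick sketch shows:

```python
def crawl_eta_minutes(pending_urls, pages_per_second):
    """Expected time left = pending URLs / crawling speed."""
    return pending_urls / pages_per_second / 60

# 10,000 pending pages at 10 pages per second ~ 16.7 minutes left.
print(round(crawl_eta_minutes(10_000, 10), 1))  # 16.7
```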
We like to adjust everything around us for our comfort, and now Netpeak Spider tables are part of your working environment that can be customized too. Let me show you what you can configure. First of all, it's column width, relative position, and data sorting by any column. Then I want to show you one feature that is usually overlooked by our customers: data grouping by any parameter or group of parameters. To do so, just drag a column header into the special zone above the table. And the last feature is column freezing: to use it, just drag the column heading to the left. Each table in Netpeak Spider can be customized separately, so you can configure each issue report the way you want it to be displayed. But note that the 'Overview', 'Site structure', and scraping reports can't be customized separately, because Netpeak Spider opens them in the URL Explorer tab, so you can have only one of them displayed at a time.
Users of Netpeak Checker 3.1 already know this feature, but now it's also available in Netpeak Spider. If you don't want to see all the data, but only some particular parameters, follow these three easy steps. First, when the crawling is complete, save the project. After that, untick the unnecessary parameters and click the 'Synchronize table with selected parameters' button. By the way, as you can see, we replaced function names with buttons, and if you hover over a button, you will see an explanation of each function. Here we go: only the selected parameters are displayed in the table. Note that if you forget what is hidden and what is shown, you can always open the previously saved project. Another hint: you can save the analyzed parameters as a template and, after some manipulations with synchronization, just apply it again and click 'Synchronize table with selected parameters'. Here we go, all the data is shown again. For the best experience, use the whole functionality of our program.
The same as the previous feature, 'Filter by value' was implemented in Netpeak Checker 3.1 and then brought to Netpeak Spider. To make a long story short, it helps you filter the data by the value of one or several cells, but only within one row. For example, you can quickly get a list of pages which are located four clicks away from the initial URL.
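In spreadsheet terms, 'Filter by value' is an equality filter on one or more columns of the current row. A pandas sketch of the 'four clicks away' example, with hypothetical column names:

```python
import pandas as pd

results = pd.DataFrame({
    "url":   ["/", "/cats", "/cats/tabby", "/deep/page/far/away"],
    "depth": [0, 1, 2, 4],
})

# Keep only pages located four clicks away from the initial URL.
print(results[results["depth"] == 4])
```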
To sum up, let's briefly run through all the things we have changed in Netpeak Spider 3.1. Most of the updates are about reports: we have added four new ones and gathered all the available reports into handy bulk exports. Now you can customize the tables the way you want, and the customization will be saved for further use even after the program is closed. Also, we have added a crawling duration forecast, which is displayed in the status bar, and copied features like 'Synchronize table with selected parameters' and 'Filter by value' from Netpeak Checker to Netpeak Spider. And of course, as usual, we have made more than 20 improvements in interface and usability to make your user experience even better. Thanks a lot, guys, for your time and patience. I want to ask you for one last thing, and I will be happy if you do it: can you please write in the comments below about a feature you had never heard of before this video? Thanks a lot, and we wish you a 100 PageSpeed Insights score and pretty snippets. That's all!
