Google Search Console URL Inspection API
Google has announced the debut of its new Search Console URL Inspection API. The company reports that SEO experts will be able to analyze URLs en masse, automate page debugging, and improve page performance. The January 31st launch is excellent news for SEOs and web developers, who can benefit from faster URL inspection. On its developer page, Google notes that the new Search Console API will let developers optimize pages and debug issues more efficiently.
At the time of writing, the most valuable features of Google Search Console for detecting and correcting SEO issues are the Index Coverage report and the URL Inspection API.
The URL Inspection Tool summarizes critical issues on your site that either block Googlebot from crawling your content or cause user experience problems. It also lists URLs with these issues and their priority level. The tool then guides you through fixing each issue so that Googlebot can access and index your site content.
Get More Valuable Data with the URL Inspection API
API stands for Application Programming Interface: a link between two software applications that allows them to interact. For example, external products and applications can use the Search Console APIs to access data beyond the Search Console interface.
SEOs and developers are already taking advantage of the various APIs available to them by creating unique solutions for viewing, adding, and removing sitemaps and properties. Using Search Console’s new API, they can run sophisticated analytics on search performance and query Search Console for data about a URL’s indexed version.
The results will show you information on the AMP status, rich results, index status, and mobile compatibility of any URL you have verified in Google Search Console (formerly Google Webmaster Tools). They will also return information for any number of submitted URLs, as long as you have access rights to those properties in the first place.
The following are some examples of the kind of information you can access when you use the URL Inspection API (a short sketch of how these fields appear in the API response follows the list):
- Accelerated Mobile Pages (AMP) data
- Google-selected canonical
- Indexing status
- Whether indexing is permitted
- Whether URL crawling is permitted or prohibited by robots.txt
- Most recent crawl status and time
- Mobile compatibility (pass or fail)
- Referring URLs
- Rich results
- Whether the URL is in the sitemap
- User-declared canonical URL
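To make these fields concrete, here is a minimal Python sketch of how they might be read from the API’s JSON response. It assumes a `response` dictionary holding a parsed response, such as the one returned by the call sketched later in this article, and the field names follow the inspectionResult schema in Google’s documentation.

```python
# Minimal, illustrative sketch: reading the fields listed above out of a
# parsed URL Inspection API response. Assumes `response` is the dictionary
# returned by the inspect call sketched later in this article.
result = response["inspectionResult"]
index_status = result.get("indexStatusResult", {})

print("Coverage state:   ", index_status.get("coverageState"))
print("Indexing allowed: ", index_status.get("indexingState"))
print("robots.txt state: ", index_status.get("robotsTxtState"))
print("Last crawl time:  ", index_status.get("lastCrawlTime"))
print("Google canonical: ", index_status.get("googleCanonical"))
print("User canonical:   ", index_status.get("userCanonical"))
print("In sitemaps:      ", index_status.get("sitemap", []))
print("Referring URLs:   ", index_status.get("referringUrls", []))
print("Mobile usability: ", result.get("mobileUsabilityResult", {}).get("verdict"))
print("Rich results:     ", result.get("richResultsResult", {}).get("verdict"))
print("AMP verdict:      ", result.get("ampResult", {}).get("verdict"))
```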
How to Use the URL Inspection API
To analyze a URL through the API, you pass the property URL and the page URL, both of which must be set up in Google Search Console. Here is an example of what the JSON request body looks like:
```json
{
  "inspectionUrl": "https://blogiestools.com/category/news/",
  "siteUrl": "https://blogiestools.com"
}
```
To see how the URL Inspection API operates, go to the index.inspect method in the API reference and hit the “Try it” button in the right sidebar. For execution to succeed, you must be authorized to view the property in Search Console.
You can set up live access to the URL Inspection API by going to console.cloud.google.com, creating a project, searching for the Google Search Console API, and enabling it. Next, configure credentials and complete the rest of the setup steps.
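If you prefer to script the call yourself rather than use the API Explorer, the following is a minimal Python sketch of a live request. It posts the same JSON body shown above to the documented urlInspection/index:inspect endpoint and assumes you already have an OAuth 2.0 access token with a Search Console scope, supplied here through an environment variable purely for illustration.

```python
import os

import requests

# Documented REST endpoint for the URL Inspection API.
ENDPOINT = "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect"


def inspect_url(access_token: str, site_url: str, page_url: str) -> dict:
    """POST a request body to the inspect endpoint and return the parsed JSON."""
    body = {"siteUrl": site_url, "inspectionUrl": page_url}
    headers = {"Authorization": f"Bearer {access_token}"}
    resp = requests.post(ENDPOINT, json=body, headers=headers, timeout=30)
    resp.raise_for_status()
    return resp.json()


if __name__ == "__main__":
    # Assumption: an OAuth 2.0 access token (with a Search Console scope such as
    # https://www.googleapis.com/auth/webmasters.readonly) obtained via the
    # credentials configured above and exported as an environment variable.
    token = os.environ["GSC_ACCESS_TOKEN"]
    response = inspect_url(
        token,
        site_url="https://blogiestools.com",
        page_url="https://blogiestools.com/category/news/",
    )
    print(response["inspectionResult"]["indexStatusResult"].get("coverageState"))
```

A raw HTTP call keeps the sketch dependency-light; the same request can also be made through Google’s official client libraries once your credentials are in place.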
Usage Limits
The developer documentation has a complete overview of the Search Console API usage limits. When you use the URL Inspection API, the quota is enforced per Search Console website property, so calls querying the same website all count against the same limit.
Because of these constraints, you can’t run the API against every URL on every site in a single day. The daily limit is 2,000 queries per property, with a per-minute limit of 600 queries. So it won’t be possible to test it against your complete one-million-page site right now; instead, you’ll have to queue URLs up or work case by case.
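Given those quotas, any bulk audit has to pace itself. The sketch below is one simple, assumed approach to queueing URLs while staying under roughly 600 requests per minute and 2,000 per day per property; it reuses the hypothetical inspect_url helper from the earlier example and is not an official throttling recommendation.

```python
import time

DAILY_QUOTA = 2000      # queries per day per property, per the documented limits
PER_MINUTE_QUOTA = 600  # queries per minute per property


def inspect_in_batches(access_token: str, site_url: str, page_urls: list[str]) -> dict:
    """Inspect URLs sequentially while staying under the published quotas.

    Returns a mapping of page URL -> parsed inspection response; URLs beyond
    the daily quota are simply left for another day.
    """
    results = {}
    for i, url in enumerate(page_urls[:DAILY_QUOTA]):
        if i and i % PER_MINUTE_QUOTA == 0:
            time.sleep(60)  # crude pause to start a fresh per-minute window
        results[url] = inspect_url(access_token, site_url, url)
    return results
```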
Key Takeaway
Google Search Console’s URL Inspection tool provides extensive information about a page. It shows whether the URL was discovered in sitemaps, the date and time the page was last crawled, indexing metadata including the user-declared and Google-selected canonicals, and the schemas Google identified.
Thanks to the URL Inspection API, SEOs and developers can now analyze sites in bulk and regularly develop automation to monitor crucial pages. It will be fascinating to observe how programmers leverage the API to create helpful custom scripts.
Feedback
Google is always eager to see the innovations SEOs and developers create through the Search Console APIs. Indeed, the new API will open up additional options for the industry to innovate with Google Search data.
How The Launch Will Impact SEO
Optimizing for local search engine rankings should be a primary objective for every company. Every SEO therefore needs Google Search Console (GSC) in their arsenal, since it gives valuable information about how a webpage performs in Google’s search results.
Google solicited developers’ feedback while designing the new API, which resulted in functionality tailored to specific use cases.
SEOs can use the tool to monitor critical pages and debug issues on particular pages. For example, you can see whether there are any disparities between the canonicals declared by the site owner and those chosen by Google. Flaws in structured data across a collection of pages can also be recognized and fixed more quickly thanks to the API.
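As a concrete illustration of that canonical check, here is a small, hypothetical helper built on the inspect_url sketch from earlier; it flags pages where the user-declared canonical differs from the one Google selected, using the userCanonical and googleCanonical fields from the response.

```python
def canonical_mismatches(access_token: str, site_url: str, page_urls: list[str]) -> list[dict]:
    """Return pages whose user-declared canonical differs from Google's choice."""
    mismatches = []
    for url in page_urls:
        status = inspect_url(access_token, site_url, url)["inspectionResult"].get(
            "indexStatusResult", {}
        )
        declared = status.get("userCanonical")
        selected = status.get("googleCanonical")
        if declared and selected and declared != selected:
            mismatches.append({"url": url, "declared": declared, "selected": selected})
    return mismatches
```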
CMS and plugin developers can then provide template- or page-level insights and ongoing checks for existing pages. Changes to critical pages can be monitored over a prolonged period if desired, allowing problems to be diagnosed accurately and the corresponding fixes prioritized.
You should now check your site in GSC if you haven’t already!
Frequently Asked Questions (FAQs)
🤔 Despite the URL Inspection Tool saying “URL is on Google,” why is my URL not ranking?
The fact that Google has indexed your URL does not mean it will rank. Ranking typically requires more than simply being indexed. To mention a few factors: the page should feature high-quality content that fulfills the user’s intent, and it needs internal links as well as links from other sites.
🧾 My URL is showing an old state in the URL Inspection Tool. Why is that?
This usually means Google hasn’t yet recrawled and reindexed your URL. You can accelerate the process by doing the following:
- Make sure that the XML sitemap contains the URL
- Improve the URL by including more internal links from high-authority pages on your website
- Links to the URL should be acquired from other websites
- Your URL should be included in your Google My Business post
🧐 Does the URL Inspection Tool take manual actions into account?
It doesn’t work that way, so keep this in mind when troubleshooting ranking and indexing difficulties. If in doubt, double-check the Manual Actions section of Google Search Console to ensure nothing is listed there.
Furthermore, the URL Inspection Tool does not take Removal Requests into account.
At SEOinc, we have mastered Google Search Console’s new API and can use it to understand the types of traffic coming to your website so we can make adjustments where necessary. With us, you’ll have ready access to a wide range of cutting-edge resources that will help you stay competitive in today’s marketplace. So whether you need help with local SEO or want to explore the latest developments in content marketing, we’re here for you. Contact us today.
About the Author
Garry Grant is a distinguished expert in search engine optimization and digital marketing, boasting over 25 years of experience in the industry. As the founder of SEO, Inc., Garry has successfully expanded the company’s offerings by developing innovative technologies and strategic solutions to address complex challenges. Please visit our SEO Company.
Garry provides specialized consulting services to select organizations with established in-house SEO and Paid Search teams. His expertise has proven instrumental in enhancing team performance, introducing cutting-edge strategies, and optimizing page speed, thereby equipping internal teams with industry-leading techniques. His client portfolio includes renowned enterprises such as SC Johnson, 20th Century Fox, Vegas.com, IGN, Walmart, Target, and Pacific Gas and Electric, among others. To schedule an initial consultation with Garry, please utilize the provided calendar link.