Google Search Console often shows WordPress core resources such as /wp-includes/js/jquery/jquery.min.js being crawled. But should you block them in robots.txt? This guide explains why Google crawls these resources and why blocking them can harm SEO.
Why Google Crawls /wp-includes/ in WordPress (And Why You Should Not Block It)
If you manage a WordPress website and regularly check Google Search Console Crawl Stats, you may notice something that looks a little strange.
Googlebot repeatedly crawls URLs like:
- /wp-includes/js/jquery/jquery.min.js
- /wp-includes/js/jquery/jquery-migrate.min.js
At first glance, this can look unnecessary. These files are not pages; they are simply JavaScript resources used by WordPress.
So the natural SEO instinct is to ask: Should I block /wp-includes/ in robots.txt to stop Google crawling it?
The short answer is no. Blocking this folder is usually a mistake and can even cause SEO issues.
Let’s look at why Google crawls these files and what would actually happen if you blocked them.
What Is the WordPress /wp-includes/ Folder?
In a standard WordPress installation, /wp-includes/ is part of the core WordPress framework.
This folder contains essential resources such as:
- JavaScript libraries (including jQuery)
- core WordPress scripts
- CSS files
- functionality used across themes and plugins
Many WordPress themes and plugins rely on these files to run properly.
For example, a page may load a file like:
- /wp-includes/js/jquery/jquery.min.js
This is a very common JavaScript library used to power site functionality, animations, navigation elements, and interactive components.
Because these files are referenced in your page code, Googlebot will also request them when crawling your site.
Why Google Crawls JavaScript and CSS Files
Modern search engines no longer read only a page's raw HTML.
Google renders pages using its Web Rendering Service (WRS), which loads:
- HTML
- JavaScript
- CSS
- images
- other resources
This process allows Google to understand how a page appears to users.
When Googlebot encounters a script reference like:
- <script src="/wp-includes/js/jquery/jquery.min.js"></script>
it may crawl that file so it can properly render the page layout and functionality.
This is why these files often appear in Search Console Crawl Stats.
It is not wasteful crawling. It is part of how Google understands modern websites.
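The discovery step described above can be sketched in a few lines: a renderer parses the page HTML and collects every script and stylesheet URL it needs to fetch. This is a minimal illustration using Python's standard library, not a model of Google's actual WRS pipeline, and the HTML snippet is a made-up example:

```python
from html.parser import HTMLParser

class ResourceCollector(HTMLParser):
    """Collects the script and stylesheet URLs a renderer would need to fetch."""
    def __init__(self):
        super().__init__()
        self.resources = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "script" and "src" in attrs:
            self.resources.append(attrs["src"])
        elif tag == "link" and attrs.get("rel") == "stylesheet" and "href" in attrs:
            self.resources.append(attrs["href"])

# Hypothetical page markup referencing WordPress core resources
html = """
<html><head>
<link rel="stylesheet" href="/wp-includes/css/dist/block-library/style.min.css">
<script src="/wp-includes/js/jquery/jquery.min.js"></script>
</head><body>Hello</body></html>
"""

collector = ResourceCollector()
collector.feed(html)
print(collector.resources)
```

Every URL collected this way becomes a candidate fetch, which is exactly why core WordPress files show up in crawl data.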
What Happens If You Block /wp-includes/ in robots.txt?
Some website owners consider adding the following rule:
User-agent: *
Disallow: /wp-includes/
This tells search engines not to crawl anything inside that folder.
If implemented, Googlebot would stop requesting files like:
- /wp-includes/js/jquery/jquery.min.js
- /wp-includes/js/jquery/jquery-migrate.min.js
While this may reduce crawl activity slightly, it introduces a more serious problem.
Google may no longer be able to fully render your pages.
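You can verify what such a rule would do using Python's standard-library robots.txt parser. This illustrates the effect of the rule itself, not Googlebot's exact implementation:

```python
from urllib import robotparser

# The rule some site owners consider adding
rules = """User-agent: *
Disallow: /wp-includes/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Core render resources would no longer be fetchable...
print(rp.can_fetch("Googlebot", "/wp-includes/js/jquery/jquery.min.js"))  # False

# ...while ordinary pages remain crawlable
print(rp.can_fetch("Googlebot", "/sample-page/"))  # True
```

The pages themselves stay crawlable, but the scripts and styles needed to render them do not, which is precisely the problem.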
Blocking Render Resources Can Cause SEO Problems
Google has long recommended that websites allow access to JavaScript and CSS files needed to render a page.
If these resources are blocked, Google may see an incomplete version of the page.
Potential consequences include:
- incomplete page rendering
- difficulty understanding layout and content
- reduced confidence in page quality signals
For example, if key scripts are blocked, Google may not correctly interpret elements like:
- navigation menus
- dynamic content
- interactive components
- certain layout structures
For WordPress websites that rely heavily on JavaScript, blocking /wp-includes/ can therefore create more problems than it solves.
Why You See /wp-includes/ in Search Console Crawl Stats
If you review the Crawl Stats report in Search Console, you may see entries like:
- /wp-includes/js/jquery/jquery.min.js
- /wp-includes/js/jquery/jquery-migrate.min.js
This is completely normal.
Google is simply requesting the same resources that a browser would load when visiting your site.
In most cases, these crawls:
- return a 200 status code
- occur alongside page crawls
- represent a very small portion of crawl activity
For the vast majority of websites, this does not impact crawl budget.
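If you want to confirm this on your own server rather than in Search Console, your access logs tell the same story. Here is a minimal sketch assuming a combined-format log line; real log locations and formats vary by host, and the line below is a made-up example:

```python
import re

# A typical combined-format access log line (illustrative, not real data)
log_line = (
    '66.249.66.1 - - [12/May/2024:10:15:32 +0000] '
    '"GET /wp-includes/js/jquery/jquery.min.js HTTP/1.1" 200 87533 '
    '"-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"'
)

# Pull out the requested path and the response status code
match = re.search(r'"GET (\S+) HTTP/[\d.]+" (\d{3})', log_line)
path, status = match.group(1), int(match.group(2))
is_googlebot = "Googlebot" in log_line

print(path, status, is_googlebot)
```

Note that user-agent strings can be spoofed; confirming a request really came from Googlebot requires a reverse DNS lookup on the requesting IP.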
Should You Ever Block WordPress Core Files?
In general, the safest approach is to leave WordPress core resources crawlable.
Blocking folders like /wp-includes/ or /wp-content/ can prevent Google from accessing important render resources.
Instead, robots.txt should usually focus on blocking areas that provide no search value, such as internal search result URLs, login pages, and other admin-only paths.
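As an illustrative sketch, a conservative WordPress robots.txt might look like the following (adjust the paths and sitemap URL to your own site; WordPress's default virtual robots.txt already takes a similar approach for /wp-admin/):

```txt
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Disallow: /?s=

Sitemap: https://example.com/sitemap.xml
```

Note that core resource folders like /wp-includes/ are deliberately left crawlable, and admin-ajax.php stays accessible because front-end features often depend on it.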
The Real SEO Priority for WordPress Sites
Rather than worrying about Google crawling /wp-includes/, SEO efforts are usually better spent on areas that actually impact visibility.
For example:
- improving internal linking
- strengthening topical content
- optimising metadata
- improving site speed
- ensuring proper indexing signals
These factors will have a much greater influence on organic performance than trying to prevent Google from fetching WordPress JavaScript files.
Final Thoughts
Seeing /wp-includes/ appear in your crawl data can look unusual at first, especially if you are reviewing Search Console closely.
However, it is simply part of how Google renders modern websites.
Blocking these files rarely provides any benefit and can sometimes prevent Google from understanding your pages correctly.
If your crawl stats show Google fetching JavaScript files from WordPress core directories, that is usually a sign that Google is successfully processing your site.
And in most cases, the best SEO decision is simply to let it.