How to Fix the "Blocked Due to Access Forbidden (403)" Error

What does “Blocked due to access forbidden (403)” mean?

When you see “Blocked due to access forbidden (403)” in Google Search Console, it means Googlebot tried to crawl a page, but your server refused access. Instead of serving the content, the server returned a 403 status code, telling Google that the bot could not view the page.

[Screenshot: “Blocked due to access forbidden (403)” report in Google Search Console]
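
You can often reproduce the block from your own terminal before digging into server settings. The sketch below sends a request with Googlebot’s user-agent string; the URL is a placeholder, so substitute a page flagged in your report:


# Request a flagged page the way Googlebot identifies itself.
# Replace the example URL with an affected page from Search Console.
curl -I -A "Googlebot/2.1 (+http://www.google.com/bot.html)" "https://example.com/blocked-page/"

If the response line shows 403, the server (or a security layer in front of it) is rejecting the bot. Keep in mind that some firewalls block by IP rather than user agent, so a 200 here doesn’t guarantee that Google’s real crawl IPs get through.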

Common causes of 403 errors in Google Search Console

  • Firewall rules that block Googlebot requests by mistake.
  • IP restrictions that accidentally deny Google’s crawling IPs.
  • Cloudflare security settings, such as Bot Fight Mode or strict WAF rules.
  • Shopify apps or custom settings that interfere with crawler access.
  • CMS plugins (especially security-focused ones) that misidentify Googlebot as a threat.
  • Incorrect file or directory permissions that prevent bots from viewing your pages (see the sketch below).
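
For the last cause, here’s a minimal sketch of checking and resetting permissions on a Linux host. It assumes your document root is /var/www/html (adjust for your server); web servers generally need directories set to 755 and files to 644:


# Inspect current permissions on the affected path (the path is an example).
ls -l /var/www/html

# Reset to conventional web-readable permissions:
# directories 755 (rwxr-xr-x), files 644 (rw-r--r--).
find /var/www/html -type d -exec chmod 755 {} \;
find /var/www/html -type f -exec chmod 644 {} \;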


How to troubleshoot and fix 403 errors

Fixing a 403 error starts with understanding why your server is denying access. Follow these practical steps to identify the root cause and ensure Googlebot can crawl your site again.

1. Check server logs for 403 responses

Start with your server logs. Look for requests from Googlebot (usually user-agent Googlebot) that return a 403 status code. This helps you confirm whether Google was blocked and which pages triggered the error.

Where to find server logs

a) cPanel hosting (e.g., Hostinger, Bluehost, SiteGround)

Log in to your cPanel, go to Metrics → Raw Access or Errors, download the logs, and search for lines containing 403 and Googlebot.

[Screenshot: Raw Access logs in cPanel]
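
Raw Access archives usually download as gzipped files. Here’s a quick local filter, assuming the archive name below stands in for whatever cPanel produces for your domain:


# Search a downloaded (gzipped) access log for 403 responses from Googlebot.
# The filename is a placeholder; cPanel names archives after your domain.
zcat example.com-access.log.gz | grep " 403 " | grep -i "googlebot"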

b) Cloudflare

Cloudflare’s free plan doesn’t provide full access logs. On Enterprise plans, you can use Logpush to export logs to storage services such as AWS S3 or Google Cloud Storage (and from there into BigQuery).

Create a Logpush job for AWS S3. Including the EdgeResponseStatus and ClientRequestUserAgent fields lets you filter the exported logs for 403 responses served to Googlebot:


curl -X POST "https://api.cloudflare.com/client/v4/zones/YOUR_ZONE_ID/logpush/jobs" \
  -H "X-Auth-Email: your-email@example.com" \
  -H "X-Auth-Key: YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  --data '{
    "name": "logpush-to-s3",
    "destination_conf": "s3://your-bucket-name/path?region=us-east-1&access-key-id=YOUR_AWS_KEY&secret-access-key=YOUR_AWS_SECRET",
    "logpull_options": "fields=ClientIP,ClientRequestHost,ClientRequestURI,ClientRequestUserAgent,EdgeResponseStatus,EdgeStartTimestamp&timestamps=rfc3339",
    "dataset": "http_requests",
    "enabled": true
  }'
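
To confirm the job was created and is running, you can list the zone’s existing Logpush jobs and look for "enabled": true in the response:


# List Logpush jobs for the zone.
curl -s "https://api.cloudflare.com/client/v4/zones/YOUR_ZONE_ID/logpush/jobs" \
  -H "X-Auth-Email: your-email@example.com" \
  -H "X-Auth-Key: YOUR_API_KEY"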

Logpush Job for Google BigQuery:

Logpush doesn’t offer a direct BigQuery destination, so the usual route is to push logs to a Google Cloud Storage bucket and load them into BigQuery from there (for example, with a scheduled load job or a Cloud Function):


curl -X POST "https://api.cloudflare.com/client/v4/zones/YOUR_ZONE_ID/logpush/jobs" \
  -H "X-Auth-Email: your-email@example.com" \
  -H "X-Auth-Key: YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  --data '{
    "name": "logpush-to-gcs",
    "destination_conf": "gs://your-bucket-name/path",
    "logpull_options": "fields=ClientIP,ClientRequestHost,ClientRequestURI,ClientRequestUserAgent,EdgeResponseStatus,EdgeStartTimestamp&timestamps=rfc3339",
    "dataset": "http_requests",
    "enabled": true
  }'

Otherwise, check the Security Events tab in your Cloudflare dashboard for blocked requests that may align with 403 errors.

[Screenshot: Security Events in the Cloudflare dashboard]

c) Shopify

Shopify doesn’t provide direct access to server logs. If you see 403 errors in Google Search Console, check installed apps or custom themes that might restrict access to specific bots.

d) VPS or dedicated server

  • Apache logs are usually at: /var/log/apache2/access.log
  • Nginx logs are usually at: /var/log/nginx/access.log

Use command-line tools to filter logs:


grep "403" access.log | grep "Googlebot"

2. Test accessibility with Google’s IP addresses

When setting up Cloudflare Logpush, it’s crucial to verify that Cloudflare can reach your target destination, such as an AWS S3 or Google Cloud Storage bucket. To do this, Cloudflare performs a test to ensure your destination is accessible from its servers, which run on Google Cloud Platform (GCP) infrastructure.

This is called a “destination accessibility test,” and it confirms that:

  • The destination exists.
  • The provided credentials are valid.
  • Firewalls, bucket policies, or IAM permissions do not block Cloudflare.

Why is this important?

Before Logpush starts sending real-time log data, this test prevents misconfiguration issues such as:

  • 403 Access Denied
  • Could not connect to bucket
  • Permission denied
  • Other silent failures where logs are never delivered.

It ensures your logging setup works as expected before live traffic starts flowing.

How to perform the test

✅ Step 1: Whitelist Google Cloud IP ranges

Since Cloudflare’s Logpush infrastructure runs on GCP, your firewall or access policy must allow incoming connections from GCP IP addresses.

You can find Google’s official IP ranges here: https://www.gstatic.com/ipranges/cloud.json

Tip: For AWS S3, this means adjusting your bucket policy. For Google Cloud Storage, it means granting Cloudflare’s Logpush service account write access to your bucket.
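
If you prefer the command line, here’s a small sketch that pulls the current IPv4 prefixes out of that file (it assumes jq is installed):


# Download Google Cloud’s published IP ranges and print the IPv4 prefixes.
curl -s "https://www.gstatic.com/ipranges/cloud.json" \
  | jq -r '.prefixes[].ipv4Prefix // empty'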

3. Review firewall and security tools (Cloudflare, hosting)

Security tools often mistake Googlebot for bad traffic. Review your firewall settings, Cloudflare rules, and any server security tools. Look for rules that block user agents, IP addresses, or entire countries. Make sure Googlebot is explicitly allowed through your firewall and security filters.

For example:

  1. In Cloudflare, check whether Bot Fight Mode or WAF rules are blocking automated crawlers.
  2. In hosting panels like cPanel or Plesk, review ModSecurity logs for any blocks against Googlebot.
  3. On WordPress, security plugins like Wordfence or iThemes Security might have strict bot-blocking settings. Ensure Googlebot’s IPs and user agents are explicitly allowed through your firewall and bot protection layers.
[Screenshot: Wordfence security plugin settings in WordPress]
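
If you manage your own server, one quick way to hunt for rules like these is to search the web server configuration for deny directives or user-agent matching. A sketch, assuming the default Debian/Ubuntu config paths:


# Search Apache and Nginx configs for deny rules or user-agent-based blocking.
# Config paths vary by distribution; these are the Debian/Ubuntu defaults.
grep -rniE "deny|user-agent|googlebot" /etc/apache2/ /etc/nginx/ 2>/dev/null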

4. Use robots.txt testing tool in GSC

Open the robots.txt tester in Google Search Console (in newer versions, the robots.txt report under Settings). Check whether your robots.txt file accidentally blocks important pages or entire folders.

[Screenshot: robots.txt tester in Google Search Console]

For example:

  • Look for lines like Disallow: / which block everything.
  • Check whether sensitive folders like /admin/ or /private/ are wrongly blocking crawler access to public content.

Fix any mistakes by editing your robots.txt file and testing again before publishing the changes.
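
For instance, here’s a sketch of an overly broad robots.txt and a corrected version (the folder names are illustrative):


# Too broad: this blocks the entire site for every crawler.
User-agent: *
Disallow: /

# Corrected: block only genuinely private areas.
User-agent: *
Disallow: /admin/
Disallow: /private/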

5. Temporarily disable security features for testing

If you can’t find the block, temporarily disable security features like Web Application Firewalls (WAF), bot protection, or plugins.

For example:

  • In Cloudflare, disable Bot Fight Mode: Dashboard → Security → Bots → Turn off “Bot Fight Mode”.
  • In Wordfence (WordPress plugin), switch the firewall to “Learning Mode” under Wordfence → Firewall.
  • If you use Apache with ModSecurity, you can disable it for testing in .htaccess:

<IfModule mod_security.c>
SecFilterEngine Off
SecFilterScanPOST Off
</IfModule>

  • If you use Nginx with ModSecurity, comment out the module include in your server config and reload Nginx:

# Comment out security rules
# include /etc/nginx/modsecurity/modsecurity.conf;

Important: Always re-enable your security settings after testing to protect your site.

6. Contact the hosting provider/service support if needed

If you’re stuck, contact your hosting provider or service support team. They can check server configurations, security layers, and access control settings that might block Googlebot.

When contacting support, provide:

  • A link to the specific page showing the 403 error.
  • A screenshot or export from Google Search Console.
  • The exact user-agent (Googlebot) and IP address details, if possible.

A short description like:

Google Search Console reports a 'Blocked due to access forbidden (403)' error for my page (URL). I suspect server settings or firewall rules might be blocking Googlebot. Could you please check?

Clear information helps the support team fix the problem faster.

7. Use the Fetch as Google tool to validate access

After making changes, use the URL Inspection Tool in Google Search Console (formerly called “Fetch as Google”).

Enter the affected URL and click “Test Live URL”:

[Screenshot: “Test Live URL” in Google Search Console]

If the page is accessible and no 403 error appears, Googlebot can crawl it correctly.

Click “Request Indexing” to ask Google to recrawl the page faster.

Track 403 errors automatically with Sitechecker

You can easily monitor and detect 403 errors using Sitechecker. In the audit report, look for 4xx Client Errors. This section shows you all blocked pages, including those with a 403 Forbidden error.

[Screenshot: 4xx Client Errors in the Sitechecker audit report]

Click “View issue” next to the error to get a complete list of affected pages and detailed recommendations.

[Screenshot: Affected pages for the 4xx Client Errors issue in Sitechecker]

403 errors often appear after server updates, new plugin installations, or changes to firewall settings, and many site owners don’t notice them until their SEO is already hurt.

For ongoing protection, set up WordPress Updates Monitoring in Sitechecker. It automatically watches for new access issues after installing new apps, plugins, or server changes.

[Screenshot: WordPress update alerts in Sitechecker]

We keep track of changes in your WordPress version, theme, or plugins and alert you instantly via email or Slack when updates are detected.

[Screenshots: Sitechecker notifications for a WordPress version change and a theme update]

You’ll catch problems like 403 errors before they hurt your site’s SEO or block Googlebot.

How to verify that the 403 error is resolved

Once you’ve made changes, it’s time to double-check everything. Open the URL Inspection Tool in Google Search Console, enter the affected URL, and click “Test Live URL”.

[Screenshot: “URL is not on Google” status in Google Search Console]

If the test shows the page is accessible and no 403 error appears, the issue is fixed.

After that, click “Request Indexing” to speed up Google’s re-crawl and get the page back into the index faster.

[Screenshot: “Request Indexing” in Google Search Console]

After fixing the issue, you’ll often see a “Validate Fix” button next to the error in Google Search Console.

Click it to tell Google you’ve solved the problem.

[Screenshot: “Validate Fix” in Google Search Console]

Google will start rechecking your pages, and if everything looks good, the 403 error will be cleared from your report within a few days.

Conclusion

A 403 error in Google Search Console means Googlebot is blocked from accessing your site. To fix it, review server logs, firewall rules, and robots.txt settings, and test accessibility. Sitechecker can automatically detect and monitor 403 errors and send you real-time alerts via email or Slack. After fixing the issue, verify the fix in Google Search Console using the “Test Live URL” and “Validate Fix” options to ensure your pages are crawlable again and recover your SEO performance quickly.

FAQ

Can a temporary 403 error hurt my SEO?

Yes. Even short-lived 403 errors can cause crawling delays, missed indexing opportunities, and temporarily reduce your site’s visibility in search results if Googlebot encounters repeated blocks during its crawl attempts.

Will Googlebot stop crawling a page that returns 403?

Yes. If Googlebot repeatedly encounters a 403, it may assume the page is permanently inaccessible and slow down or stop crawling that URL until a successful fetch is confirmed.

Can platforms like Cloudflare or Shopify cause 403 errors on their own?

Not directly. However, misconfigured security settings, automated bot protections, or third-party app conflicts on platforms like Cloudflare or Shopify can accidentally cause 403 errors against legitimate bots like Googlebot.

Does clicking “Validate Fix” clear the error immediately?

No. Validation triggers Google to recheck the page, but it may take several days for Google to process the request and clear the error if the page is now accessible.

How can I prevent 403 errors from coming back?

Always monitor changes in real time using tools like Sitechecker, test site accessibility manually after major updates, and keep a changelog of installed apps, firewall rules, and hosting adjustments to trace any new issues quickly.