
Block old browser versions and suspicious browsers

Author: hupe13
With the help of WhatIsMyBrowser the plugin detects old and bad browsers and denies them access. A special robots.txt denies crawling by bad bots.
Version
1.0.1
Last updated
Dec 12, 2025

Every time your web browser makes a request to a website, it sends an HTTP header called the “User Agent”. The User Agent string contains information about your web browser's name, operating system, device type, and many other useful bits of information.
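For example, a typical User Agent string from a current Chrome browser on Windows looks like this (illustrative only; real strings vary by build):

```
Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/128.0.0.0 Safari/537.36
```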

Via an API, the plugin sends the User Agent string of every browser that accesses your website for the first time to https://api.whatismybrowser.com/api/v2/user_agent_parse to obtain the following information about the User Agent:

  • Software Name & Version
  • Operating System Name & Version
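The step above can be sketched roughly as follows. This is an illustrative Python sketch, not the plugin's actual PHP code, and the response shape used here (a `parse` object with `software_name`, `software_version`, `operating_system_name`, `operating_system_version` fields) is an assumption, not the documented WhatIsMyBrowser schema:

```python
# Hypothetical sketch: pull the two pieces of information the plugin
# stores out of a user_agent_parse API response.
# The dict layout below is an ASSUMPTION about the response format.

def extract_browser_info(response: dict) -> dict:
    parse = response.get("parse", {})
    return {
        "software_name": parse.get("software_name"),
        "software_version": parse.get("software_version"),
        "os_name": parse.get("operating_system_name"),
        "os_version": parse.get("operating_system_version"),
    }

# Example response a parse call might return (made up for illustration):
sample = {
    "parse": {
        "software_name": "Chrome",
        "software_version": "127",
        "operating_system_name": "Windows",
        "operating_system_version": "10",
    }
}

info = extract_browser_info(sample)
print(info["software_name"], info["software_version"])  # Chrome 127
```

In the real plugin, the raw User Agent string is sent to the endpoint (with your API key) and the extracted fields are cached in a database table so each browser is only looked up once.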

WhatIsMyBrowser.com API Terms and Conditions

With this information, the plugin attempts to detect old and bad browsers and denies them access to your website.

HowTo

  • Go to What is my browser? and sign up to the WhatIsMyBrowser.com API for a Basic (free) Application Plan.
  • You have a limit of 5,000 hits per month for parsing User Agents. That’s why the plugin manages a database table.
  • The User Agent string of every browser that accesses your website for the first time is sent to this service, and the information is stored in a table.
  • Browsers are blocked if the browser and/or system are outdated:
    • Default: Chrome and Chrome based browsers < 128, Firefox < 128, Internet Explorer, Netscape (!), Opera < 83, Safari < 17
    • Old systems include all Windows versions prior to Windows 10 and some macOS and Android versions.
  • A browser is also blocked if its “simple software string” contains “unknown” or is empty.
  • You can also set up other browsers.
  • Sometimes there are false positives, for example when the request comes from Mastodon. In this case, you can exclude the browser from the check.
  • The plugin checks whether the crawlers really originate from Google, Bing, Yandex, Apple, Mojeek, Baidu, Seznam.
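The default blocking rule described above can be sketched like this. This is an illustrative Python sketch under the stated defaults, not the plugin's actual PHP implementation:

```python
# Default minimum major versions per the plugin's documentation;
# browsers below these thresholds are blocked.
MIN_VERSION = {"Chrome": 128, "Firefox": 128, "Opera": 83, "Safari": 17}

# Browsers blocked outright, regardless of version.
ALWAYS_BLOCKED = {"Internet Explorer", "Netscape"}

def is_blocked(software_name: str, major_version: int,
               simple_software_string: str = "") -> bool:
    # Empty or "unknown" simple software strings are blocked.
    if not simple_software_string or "unknown" in simple_software_string.lower():
        return True
    if software_name in ALWAYS_BLOCKED:
        return True
    minimum = MIN_VERSION.get(software_name)
    if minimum is not None and major_version < minimum:
        return True
    return False

print(is_blocked("Firefox", 115, "Firefox 115 on Windows 10"))  # True
print(is_blocked("Chrome", 130, "Chrome 130 on Windows 11"))    # False
```

Chrome-based browsers would be mapped to the Chrome threshold, and the operating-system check (Windows before 10, certain macOS and Android versions) would be applied in the same way.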

About robots.txt

  • You can configure some rewrite rules to provide a robots.txt file that can allow or deny crawling for a browser. If crawling is denied, access to your website will be blocked for that browser.
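For illustration, a robots.txt along these lines would deny crawling to all bots except explicitly allowed ones (the exact rules the plugin serves may differ):

```
# Deny everything by default
User-agent: *
Disallow: /

# Explicitly allow trusted crawlers
User-agent: Googlebot
Allow: /

User-agent: Bingbot
Allow: /
```

Bots that ignore a Disallow rule and crawl anyway can then be treated as bad bots and blocked.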

Logging

  • Logging can be very detailed. Please check the logs and the WIMB table regularly.
By installing, you agree to the WordPress.com Terms of Service and the Third-Party Plugin Terms.
Tested up to
WordPress 6.9
This plugin is available for download and can be used on .