Powered by https://findmybug.com/free-website-health-check

Health report: https://safeyourweb.in/

Examined at: Nov 9, 2024 09:08:00

Follow the recommendations in this health report to keep your site healthy.

Overall Score: 19.1 / 100
Desktop Score: 78.39 / 100
Mobile Score: 64.66 / 100

Page Title

SafeYourWeb - Be Fearless Online With Our Advanced AI Powered Security Solutions

Short Recommendation

Your page title exceeds 60 characters, which is not ideal; consider shortening it.

The title is the heading of the webpage. The string enclosed in the HTML <title> tag is the title of your website. Search engines read the title of your website and display it along with your website address in search results. The title is the most important element for both SEO and social sharing. It should be no longer than 50 to 60 characters, because search engines typically display only about that many characters in search results. A good title can contain the primary keyword, the secondary keyword, and the brand name; for example, a fictitious gaming-information site might use the title "The future of gaming information is here". A page title should give a proper glimpse of the website: it identifies your site for users, search engines, and social sharing, so choose a clear and catchy title.
Learn more

Meta Description

SafeYourWeb provides comprehensive cybersecurity solutions to protect businesses from cyber threats. Explore our AI-powered security audits, penetration testing,VoIP penetration testing, and training programs. Secure your digital future today!

Short Recommendation

Your meta description exceeds 150 characters, which is not ideal; consider shortening it.

The meta description is a short summary of the page that search engines often display below the title in search results. Keep it under roughly 150-160 characters so it is not truncated, and make sure it accurately describes the page content.
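
As a quick way to verify both the title and meta description recommendations above, the sketch below fetches the page and reports both lengths. It is a minimal example, assuming Python 3 with the third-party requests and beautifulsoup4 packages installed.

    import requests
    from bs4 import BeautifulSoup

    # Fetch the page and parse it.
    html = requests.get("https://safeyourweb.in/", timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    desc_tag = soup.find("meta", attrs={"name": "description"})
    description = (desc_tag.get("content") or "").strip() if desc_tag else ""

    # Common guidance: title <= 60 characters, description <= 150-160 characters.
    print(f"title length: {len(title)} (recommended <= 60)")
    print(f"meta description length: {len(description)} (recommended <= 150)")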

Meta Keyword

Short Recommendation

Your site does not have any meta keywords.

Meta keywords are keywords placed inside meta tags. They are unlikely to be used for search engine ranking anymore; the words from your title and description can be reused as meta keywords. It can still be a reasonable practice for purposes other than ranking.

Keyword Analysis

Single Keywords

Keyword Occurrence Density Possible Spam
Security 10 1.764 % No
security 10 1.764 % No
data 6 1.058 % No
protect 6 1.058 % No
business 5 0.882 % No
services 5 0.882 % No
solutions 5 0.882 % No
identify 5 0.882 % No
businesses 5 0.882 % No
team 5 0.882 % No
SafeYourWeb 4 0.705 % No
Solutions 4 0.705 % No
online 4 0.705 % No
experts 4 0.705 % No
clients 4 0.705 % No
attacks 4 0.705 % No
AI 3 0.529 % No
Contact 3 0.529 % No
threats 3 0.529 % No
cybersecurity 3 0.529 % No

Two Word Keywords

Keyword Occurrence Density Possible Spam
your business 5 0.882 % No
We provide 5 0.882 % No
Security We 4 0.705 % No
Our team 4 0.705 % No
with our 4 0.705 % No
Application Security 3 0.529 % No
provide professional 3 0.529 % No
professional and 3 0.529 % No
and reliable 3 0.529 % No
security services 3 0.529 % No
services to 3 0.529 % No
to help 3 0.529 % No
help businesses 3 0.529 % No
businesses protect 3 0.529 % No
protect their 3 0.529 % No
their online 3 0.529 % No
online assets 3 0.529 % No
assets Our 3 0.529 % No
team of 3 0.529 % No
of skilled 3 0.529 % No

Three Word Keywords

Keyword Occurrence Density Possible Spam
Security We provide 3 0.529 % No
We provide professional 3 0.529 % No
provide professional and 3 0.529 % No
professional and reliable 3 0.529 % No
security services to 3 0.529 % No
services to help 3 0.529 % No
to help businesses 3 0.529 % No
help businesses protect 3 0.529 % No
businesses protect their 3 0.529 % No
protect their online 3 0.529 % No
their online assets 3 0.529 % No
online assets Our 3 0.529 % No
assets Our team 3 0.529 % No
Our team of 3 0.529 % No
team of skilled 3 0.529 % No
of skilled security 3 0.529 % No
skilled security experts 3 0.529 % No
security experts works 3 0.529 % No
experts works closely 3 0.529 % No
works closely with 3 0.529 % No

Four Word Keywords

Keyword Occurrence Density Possible Spam
Security We provide professional 3 0.529 % No
We provide professional and 3 0.529 % No
provide professional and reliable 3 0.529 % No
security services to help 3 0.529 % No
services to help businesses 3 0.529 % No
to help businesses protect 3 0.529 % No
help businesses protect their 3 0.529 % No
businesses protect their online 3 0.529 % No
protect their online assets 3 0.529 % No
their online assets Our 3 0.529 % No
online assets Our team 3 0.529 % No
assets Our team of 3 0.529 % No
Our team of skilled 3 0.529 % No
team of skilled security 3 0.529 % No
of skilled security experts 3 0.529 % No
skilled security experts works 3 0.529 % No
security experts works closely 3 0.529 % No
experts works closely with 3 0.529 % No
works closely with our 3 0.529 % No
closely with our clients 3 0.529 % No

Keyword Usage

Short Recommendation

The most frequently used keywords do not match the meta keywords.

Keyword usage refers to how your keywords appear inside meta tags and in the content of your website. Use keywords that describe your site accurately so that search engines return precise results for it.
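
The density figures in the keyword tables above are simply occurrences divided by the total word count (567 words on this page, per the Total Words section below). A minimal Python sketch of the calculation, using counts taken from this report:

    # Total word count taken from this report.
    TOTAL_WORDS = 567

    def keyword_density(occurrences: int, total_words: int = TOTAL_WORDS) -> float:
        """Density (%) = occurrences / total words * 100."""
        return round(occurrences / total_words * 100, 3)

    print(keyword_density(10))  # "Security" appears 10 times -> 1.764
    print(keyword_density(5))   # "business" appears 5 times  -> 0.882
    print(keyword_density(3))   # "threats" appears 3 times   -> 0.529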

Sitemap

Short Recommendation

Your site does not have a sitemap.

A sitemap is an XML file that contains a full list of your website's URLs. It tells search engines which directories and pages of your website should be crawled and indexed, and gives users a way to discover pages. It can help search engine robots index your website faster and more thoroughly. It is roughly the opposite of robots.txt. You can create a sitemap.xml with various free or paid services, or you can write it yourself (read about how to write a sitemap).

Also keep these things in mind:
1) A sitemap must be smaller than 10 MB (10,485,760 bytes) and can contain at most 50,000 URLs. If you have more URLs than this, create multiple sitemap files and use a sitemap index file.
2) Put your sitemap in the website root directory and add the URL of your sitemap to robots.txt.
3) sitemap.xml can be compressed using gzip for faster loading.
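
A minimal sketch of generating a basic sitemap.xml with the Python standard library; the URL list and lastmod date are placeholders, not taken from the site:

    import xml.etree.ElementTree as ET

    # Placeholder URL list -- replace with the real pages of the site.
    urls = ["https://safeyourweb.in/", "https://safeyourweb.in/services"]

    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = "2024-11-09"  # placeholder date

    # Write sitemap.xml to the current directory; upload it to the site root.
    ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)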

Broken link: a broken link is an inaccessible link or URL on a website. A high rate of broken links has a negative effect on search engine ranking due to reduced link equity, and it also hurts the user experience. There are several possible reasons for a broken link; all are listed below, followed by a small link-check sketch.
1) An incorrect link entered by you.
2) The destination website removed the linked web page (a common 404 error).
3) The destination website has moved permanently or no longer exists (a changed domain, or a blocked or defunct site).
4) The user may be behind a firewall or similar security mechanism that blocks access to the destination website.
5) You have linked to a site that is blocked by a firewall or similar software for outside access.
Learn more
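
A minimal sketch of checking links for 4xx/5xx responses, assuming Python 3 with the third-party requests package; the link list below is a placeholder:

    import requests

    links = ["https://safeyourweb.in/", "https://example.com/missing-page"]  # placeholders

    for link in links:
        try:
            resp = requests.head(link, allow_redirects=True, timeout=10)
            status = resp.status_code
        except requests.RequestException as exc:
            status = f"unreachable ({exc.__class__.__name__})"
        # Any 4xx/5xx status (or an exception) indicates a broken link.
        print(link, "->", status)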

Total Words

567

Unique words are uncommon words that reflect your site's features and information. Search engine metrics do not use unique words as a ranking factor, but they are still useful for getting a proper picture of your site's content. Using positive unique words like "complete", "perfect", or "shiny" is good for the user experience.

Stop words are common words such as prepositions and generic words like "download", "click me", "offer", or "win". Since the most used keywords are at best a slight factor for visitors, you are encouraged to use more unique words and fewer stop words.

Text/HTML Ratio Test

Site passed the text/HTML ratio test.

Text/HTML Ratio Test : 26%

The ideal ratio of text to HTML code lies between 20% and 60%. If it falls below 20%, you need to write more text content on your web page; above 60%, your page might be considered spam.
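
A minimal sketch of the text/HTML ratio calculation, assuming Python 3 with the third-party requests and beautifulsoup4 packages installed:

    import requests
    from bs4 import BeautifulSoup

    html = requests.get("https://safeyourweb.in/", timeout=10).text
    text = BeautifulSoup(html, "html.parser").get_text(separator=" ", strip=True)

    # Ratio of visible text to total HTML, as a percentage.
    ratio = len(text) / len(html) * 100
    print(f"text/HTML ratio: {ratio:.0f}%")  # this report measured about 26%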

HTML Headings

  • H1 (3)
      • Be Fearless Online With Our Advanced AI Powered Security Solutions
      • See What Our Clients Love Most
      • SafeYour Web
  • H2 (4)
      • Why Security Matters for Your Business
      • Why Cybersecurity is Essential
      • Why Choose SafeYourWeb
      • Ready to Take the Next Step?
  • H3 (0)
  • H4 (0)
  • H5 (16)
      • Vulnerability Assessments
      • Threat Detection
      • Security Audits and Compliance
      • Web Application Security
      • Mobile Application Security
      • API Security
      • Ransomware
      • Phishing
      • Data Breaches
      • Malware Attacks
      • Top-notch Security
      • Expert Team
      • Proven Results
      • Customized Solutions
      • AI-Powered Solutions
      • 24/7 Support
  • H6 (3)
      • Company
      • Services
      • Contact

h1 status is the existence of any content inside the h1 tag. Although not as important as the meta title and description for search engine ranking, it is still a good way to describe your content in search engine results.

The h2 tag is less important, but it should be used so visitors can properly understand the structure of your website.

robots.txt

Short Recommendation

Your site does not have a robots.txt file.



robots.txt is a text file that resides in the website root directory and contains instructions for various robots (mainly search engine robots) on how to crawl and index your website. It names the search bots or other bots, lists the directories that are allowed or disallowed for crawling and indexing, can set a crawl delay for bots, and can even point to the sitemap URL. Full access, full restriction, or customized access and restriction can all be imposed through robots.txt.

robots.txt is very important for SEO. Your website directories will be crawled and indexed by search engines according to the robots.txt instructions, so add a robots.txt file to your website root directory. Write it properly, including your content-rich pages and other public pages, and exclude any pages that contain sensitive information. Remember that restricting access through robots.txt is not a security measure, so do not rely on it to protect sensitive information.
Learn more
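
A minimal sketch that checks for robots.txt and writes a basic one, assuming Python 3 with the third-party requests package; the Disallow path is a placeholder:

    import requests

    # Check whether the file already exists at the site root.
    resp = requests.get("https://safeyourweb.in/robots.txt", timeout=10)
    print("robots.txt present:", resp.status_code == 200)

    # A basic, permissive robots.txt; adjust the placeholder Disallow path.
    basic_robots = (
        "User-agent: *\n"
        "Disallow: /admin/\n"
        "Sitemap: https://safeyourweb.in/sitemap.xml\n"
    )
    with open("robots.txt", "w", encoding="utf-8") as fh:
        fh.write(basic_robots)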

Internal Vs. External Links

Total Internal Links

1

Total External Links

0
  • Internal Links
  • /cdn-cgi/l/email-protection
  • External Links

Domain IP Information

IP

172.67.195.233

City

San Francisco

Country

US

Time Zone

America/Los_Angeles

Longitude

-122.3971

Latitude

37.7621

NoIndex , NoFollow, DoFollow Links

Total NoIndex Links

0

Total NoFollow Links

0

Total DoFollow Links

1

NoIndex Enabled by Meta Robot?

No

NoFollow Enabled by Meta Robot?

No
  • NoIndex Links
  • NoFollow Links

NoIndex : noindex is a directive used as a meta tag value. The noindex directive tells search engines not to show your website in search results. Do not set 'noindex' in your meta tags if you want your website to appear in search engine results.

By default, a webpage is set to "index." Add a <meta name="robots" content="noindex" /> directive in the <head> section of the HTML if you do not want search engines to index a given page and include it in the SERPs (Search Engine Results Pages).

DoFollow & NoFollow : nofollow is a directive that can be set as a meta tag value or on individual links. The nofollow directive tells search engine bots not to follow the links on your website. Do not set 'nofollow' as a meta tag value if you want search engine bots to follow your links.

By default, links are set to “follow.” You would set a link to “nofollow” in this way: <a href="http://www.example.com/" rel="nofollow">Anchor Text</a> if you want to suggest to Google that the hyperlink should not pass any link equity/SEO value to the link target.

Learn more
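
A minimal sketch that counts follow vs. nofollow links on the page, assuming Python 3 with the third-party requests and beautifulsoup4 packages installed:

    import requests
    from bs4 import BeautifulSoup

    soup = BeautifulSoup(requests.get("https://safeyourweb.in/", timeout=10).text, "html.parser")

    links = soup.find_all("a", href=True)
    nofollow = [a for a in links if "nofollow" in (a.get("rel") or [])]
    print("total links:", len(links))
    print("nofollow links:", len(nofollow))
    print("dofollow links:", len(links) - len(nofollow))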

SEO Friendly Links

Short Recommendation

The links on your site are SEO friendly.


An SEO friendly link roughly follows these rules: the URL uses a dash as the word separator, contains no parameters or numbers, and is a static URL.

To resolve this, use these techniques (a small URL clean-up sketch follows below).
1) Replace underscores or other separators with dashes, and clean the URL by deleting or replacing numbers and parameters.
2) Merge your www and non-www URLs.
3) Do not use dynamic or relative URLs. Create an XML sitemap for proper indexing by search engines.
4) Block unfriendly and irrelevant links through robots.txt.
5) Declare your canonical URLs in a canonical tag.
Learn more
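
A minimal sketch of cleaning a URL into an SEO-friendly form (dashes as separators, no query parameters), using only the Python standard library; the example path is hypothetical:

    import re
    from urllib.parse import urlsplit

    def seo_friendly(url: str) -> str:
        parts = urlsplit(url)
        path = parts.path.replace("_", "-")        # underscores -> dashes
        path = re.sub(r"-{2,}", "-", path)         # collapse repeated dashes
        return f"{parts.scheme}://{parts.netloc}{path}"  # drop query parameters

    print(seo_friendly("https://safeyourweb.in/web_application__security?id=42"))
    # -> https://safeyourweb.in/web-application-security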

Plain Text Email Test

Short Recommendation

Site failed the plain text email test. 2 plain text emails found.

  • Plain Text Email List
  • bootstrap@5.3.3
  • bootstrap@5.3.3

A plain text email address is vulnerable to email-scraping agents. An email-scraping agent crawls your website and collects every email address written in plain text, so plain text email addresses on your website can help spammers with email harvesting. This can also be a bad sign for search engines.

To fight this you can obfuscate your email addresses in several ways (a small encoding sketch follows below):
1) CSS pseudo-classes.
2) Writing your email address backwards.
3) Turning off display using CSS.
4) Obfuscating your email address using JavaScript.
5) Using WordPress and PHP (WordPress sites only).
Learn more
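
A minimal sketch of one obfuscation approach: encode every character of the address as an HTML character reference, so simple scrapers looking for plain "user@domain" strings miss it (determined scrapers can still decode it). The address below is a placeholder, not taken from the site:

    def obfuscate_email(address: str) -> str:
        # Encode each character as an HTML numeric character reference.
        return "".join(f"&#{ord(ch)};" for ch in address)

    email = "info@safeyourweb.in"  # placeholder example address
    print(f'<a href="mailto:{obfuscate_email(email)}">{obfuscate_email(email)}</a>')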

Favicon

Short Recommendation

Your site has a favicon.

DOC Type

DOC Type : <!doctype html>
Short Recommendation

Page has a doc type.

The doc type is not an SEO factor, but it is checked when validating your web page, so set a doctype in your HTML page.
Learn more

Image 'alt' Test

Short Recommendation

Your site has 1 image without alt text.

  • Images Without alt
  • /static/data.png

The alt attribute provides alternate text for an image and describes its content. It helps search engine spiders understand the image and improves the accessibility of your website. Put a suitable description on every image that is part of your content (purely decorative design images can be skipped). To resolve this, add a suitable description to the alt attribute of the image listed above.
Learn more
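
A minimal sketch that lists images missing alt text, assuming Python 3 with the third-party requests and beautifulsoup4 packages installed:

    import requests
    from bs4 import BeautifulSoup

    soup = BeautifulSoup(requests.get("https://safeyourweb.in/", timeout=10).text, "html.parser")

    # Collect the src of every <img> whose alt attribute is missing or empty.
    missing = [img.get("src") for img in soup.find_all("img") if not img.get("alt")]
    print("images without alt text:", missing)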

Deprecated HTML Tags

Short Recommendation

Your site does not have any deprecated HTML tags.


Older HTML tags and attributes that have been superseded by other, more functional or flexible alternatives (whether in HTML or in CSS) are declared deprecated in HTML4 by the W3C, the consortium that sets the HTML standards. Browsers should continue to support deprecated tags and attributes, but eventually these tags are likely to become obsolete, so future support cannot be guaranteed.

HTML Page Size

HTML Page Size : 16 KB
Short Recommendation

HTML page size is less than 100 KB.

HTML page size is one of the main factors in webpage loading time. It should be less than 100 KB according to Google's recommendation. Note that this size does not include external CSS, JS, or image files. The smaller the page, the shorter the loading time.

To reduce your page size, follow these steps:
1) Move all your CSS and JS code to external files.
2) Make sure your text content is at the top of the page so it can be displayed before the full page has loaded.
3) Reduce or compress all images, media files, etc.; it is better if each of these files is less than 100 KB.
Learn more
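
A minimal sketch that measures the raw HTML size of the page, assuming Python 3 with the third-party requests package:

    import requests

    resp = requests.get("https://safeyourweb.in/", timeout=10)
    size_kb = len(resp.content) / 1024
    print(f"HTML page size: {size_kb:.0f} KB")  # this report measured about 16 KB
    print("within the 100 KB recommendation:", size_kb < 100)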

GZIP Compression

Short Recommendation

GZIP compression is disabled.

GZIP is a generic compressor that can be applied to any stream of bytes: under the hood it remembers some of the previously seen content and attempts to find and replace duplicate data fragments in an efficient way (see any low-level explanation of GZIP if you are curious). In practice, GZIP performs best on text-based content, often achieving compression rates as high as 70-90% for larger files, whereas running GZIP on assets that are already compressed by other algorithms (e.g. most image formats) yields little to no improvement. It is also recommended that the GZIP-compressed size be <= 33 KB.
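
A minimal sketch that checks whether the server responds with a compressed body, assuming Python 3 with the third-party requests package:

    import requests

    resp = requests.get(
        "https://safeyourweb.in/",
        headers={"Accept-Encoding": "gzip, deflate, br"},
        timeout=10,
    )
    encoding = resp.headers.get("Content-Encoding", "none")
    print("Content-Encoding:", encoding)  # "gzip" or "br" means compression is enabled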

Inline CSS

Short Recommendation

Your site has 6 instances of inline CSS.

  • Inline CSS
  • <div class=container style=margin-top:3rem;margin-bottom:5rem></div>
  • <h1 class=res style=font-size:67px;font-family:Manrope,sans-serif;font-weight:600;padding-top:10vh;padding-bottom:2vh></h1>
  • <p class=lead style="padding:0 6vw;font-size:18px">
  • <span style=font-weight:600></span>
  • <span style=font-weight:600></span>
  • <span style=color:#0bac0b></span>

Inline CSS is CSS code that resides in the HTML page inside HTML tags rather than in an external .css file. Inline CSS increases the loading time of your webpage, which is an important search engine ranking factor, so try not to use inline CSS.
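
A minimal sketch that lists elements carrying inline style attributes, assuming Python 3 with the third-party requests and beautifulsoup4 packages installed:

    import requests
    from bs4 import BeautifulSoup

    soup = BeautifulSoup(requests.get("https://safeyourweb.in/", timeout=10).text, "html.parser")

    # Every tag that has a style="..." attribute counts as inline CSS.
    inline = soup.find_all(style=True)
    print("elements with inline CSS:", len(inline))
    for el in inline:
        print(el.name, el.get("style"))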

Internal CSS

Short Recommendation

Your site does not have any internal CSS.

Internal CSS is CSS code that resides in the HTML page inside a style tag. Internal CSS increases loading time because it cannot be cached separately from the page. Try to put your CSS code in an external file.

Micro Data Schema Test

Short Recommendation

Site failed the micro data schema test.

Microdata is structured information underlying an HTML string or paragraph. Consider the string "Avatar": it could refer to a profile picture on a forum, blog, or social networking site, or it could refer to a highly successful 3D movie. Microdata is used to specify the reference or underlying meaning of an HTML string. Microdata gives search engines and other applications a chance to understand your content better and display it more prominently in search results.
Learn more

IP & DNS Report

IPv4

172.67.195.233

IPv6

2606:4700:3033::ac43:c3e9
DNS Report
SL  Host            Class  TTL    Type  PRI  Target / IP
1   safeyourweb.in  IN     299    A          104.21.68.135
2   safeyourweb.in  IN     299    A          172.67.195.233
3   safeyourweb.in  IN     21600  NS         vicky.ns.cloudflare.com
4   safeyourweb.in  IN     21600  NS         megan.ns.cloudflare.com
5   safeyourweb.in  IN     300    MX    10   mail.safeyourweb.in
6   safeyourweb.in  IN     299    AAAA       2606:4700:3035::6815:4487
7   safeyourweb.in  IN     299    AAAA       2606:4700:3033::ac43:c3e9

IP Canonicalization Test

Short Recommendation

Site failed IP canonicalization test.

If multiple domain names are registered under a single IP address, search bots can label the other sites as duplicates of one site. This is IP canonicalization, somewhat like URL canonicalization. To solve this, use redirects.
Learn more

URL Canonicalization Test

Short Recommendation

Site failed URL canonicalization test.

Canonical tags consolidate all URLs that lead to a single address or webpage into one URL. For example:
<link rel="canonical" href="https://mywebsite.com/home" />
<link rel="canonical" href="https://www.mywebsite.com/home" />
Both refer to mywebsite.com/home, so all the different URLs with the same content or page now come under that single URL. This boosts your search engine ranking by eliminating content duplication. Use a canonical tag for all URLs that serve the same content.
Learn more

cURL Response

  • url : https://safeyourweb.in/
  • content type : text/html; charset=utf-8
  • http code : 200
  • header size : 880
  • request size : 131
  • filetime : -1
  • ssl verify result : 20
  • redirect count : 0
  • total time : 0.07122
  • namelookup time : 0.026254
  • connect time : 0.033759
  • pretransfer time : 0.040923
  • size upload : 0
  • size download : 16085
  • speed download : 225849
  • speed upload : 0
  • download content length : -1
  • upload content length : 0
  • starttransfer time : 0.071108
  • redirect time : 0
  • redirect url :
  • primary ip : 2606:4700:3035::6815:4487
  • certinfo :
  • primary port : 443
  • local ip : 2a02:4780:11:1234::2f
  • local port : 48990
  • http version : 3
  • protocol : 2
  • ssl verifyresult : 0
  • scheme : HTTPS
  • appconnect time us : 74569
  • connect time us : 33759
  • namelookup time us : 26254
  • pretransfer time us : 40923
  • redirect time us : 0
  • starttransfer time us : 71108
  • total time us : 71220

PageSpeed Insights (Mobile)

Performance

  • Emulated Form Factor Mobile
  • Locale En-US
  • Category Performance
  • Field Data
  • First Contentful Paint (FCP)
  • FCP Metric Category
  • First Input Delay (FID)
  • FID Metric Category
  • Overall Category
  • Origin Summary
  • First Contentful Paint (FCP)
  • FCP Metric Category
  • First Input Delay (FID)
  • FID Metric Category
  • Overall Category
  • Lab Data
  • First Contentful Paint 3.2 s
  • First Meaningful Paint
  • Speed Index 3.2 s
  • First CPU Idle
  • Time to Interactive 3.3 s
  • Max Potential First Input Delay 20 ms

Audit Data

Resources Summary

Aggregates all network requests and groups them by type. Learn More

Eliminate render-blocking resources

Potential savings of 2,250 ms

Resources are blocking the first paint of your page. Consider delivering critical JS/CSS inline and deferring all non-critical JS/styles. Learn More

Efficiently encode images

Optimized images load faster and consume less cellular data. Learn More

Enable text compression

Text-based resources should be served with compression (gzip, deflate or brotli) to minimize total network bytes. Learn More

Serve static assets with an efficient cache policy

6 resources found

A long cache lifetime can speed up repeat visits to your page. Learn More

Minimize third-party usage

Third-party code blocked the main thread for 0 ms

Third-party code can significantly impact load performance. Limit the number of redundant third-party providers and try to load third-party code after your page has primarily finished loading. Learn More

Total Blocking Time

0 ms

Sum, over all long tasks between FCP and Time to Interactive, of the portion of each task that exceeded 50 ms, expressed in milliseconds.
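
A minimal sketch of that calculation; the task durations below are made-up example values, not measurements from this report:

    # Hypothetical main-thread task durations (ms) between FCP and TTI.
    tasks_ms = [30, 70, 120, 45]

    # Only the portion of each task beyond 50 ms counts as "blocking".
    tbt = sum(max(0, t - 50) for t in tasks_ms)
    print(f"Total Blocking Time: {tbt} ms")  # 0 + 20 + 70 + 0 = 90 ms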

JavaScript execution time

0.0 s

Consider reducing the time spent parsing, compiling, and executing JS. You may find delivering smaller JS payloads helps with this. Learn More

Defer offscreen images

Consider lazy-loading offscreen and hidden images after all critical resources have finished loading to lower time to interactive. Learn More

Server Backend Latencies

50 ms

Server latencies can impact web performance. If the server latency of an origin is high, it's an indication the server is overloaded or has poor backend performance. Learn More

Properly size images

Potential savings of 68 KiB

Serve images that are appropriately-sized to save cellular data and improve load time. Learn More

Reduce unused CSS

Potential savings of 25 KiB

Reduce unused rules from stylesheets and defer CSS not used for above-the-fold content to decrease bytes consumed by network activity. Learn More

Avoids enormous network payloads

Total size was 463 KiB

Large network payloads cost users real money and are highly correlated with long load times. Learn More

Minimizes main-thread work

0.8 s

Consider reducing the time spent parsing, compiling and executing JS. You may find delivering smaller JS payloads helps with this. Learn More

Avoid chaining critical requests

3 chains found

The Critical Request Chains below show you what resources are loaded with a high priority. Consider reducing the length of chains, reducing the download size of resources, or deferring the download of unnecessary resources to improve page load. Learn More

Avoids an excessive DOM size

220 elements

A large DOM will increase memory usage, cause longer style calculations, and produce costly layout reflows. Learn More

Avoid multiple page redirects

Redirects introduce additional delays before the page can be loaded. Learn More

Minify JavaScript

Minifying JavaScript files can reduce payload sizes and script parse time. Learn More

User Timing marks and measures

Consider instrumenting your app with the User Timing API to measure your app's real-world performance during key user experiences. Learn More

Network Round Trip Times

0 ms

Network round trip times (RTT) have a large impact on performance. If the RTT to an origin is high, it's an indication that servers closer to the user could improve performance. Learn More

PageSpeed Insights (Desktop)

Performance

  • Emulated Form Factor Desktop
  • Locale En-US
  • Category Performance
  • Field Data
  • First Contentful Paint (FCP)
  • FCP Metric Category
  • First Input Delay (FID)
  • FID Metric Category
  • Overall Category
  • Origin Summary
  • First Contentful Paint (FCP)
  • FCP Metric Category
  • First Input Delay (FID)
  • FID Metric Category
  • Overall Category
  • Lab Data
  • First Contentful Paint 0.8 s
  • First Meaningful Paint
  • Speed Index 1.0 s
  • First CPU Idle
  • Time to Interactive 0.8 s
  • Max Potential First Input Delay 20 ms

Audit Data

Resources Summary

Aggregates all network requests and groups them by type. Learn More

Eliminate render-blocking resources

Potential savings of 520 ms

Resources are blocking the first paint of your page. Consider delivering critical JS/CSS inline and deferring all non-critical JS/styles. Learn More

Efficiently encode images

Optimized images load faster and consume less cellular data. Learn More

Enable text compression

Text-based resources should be served with compression (gzip, deflate or brotli) to minimize total network bytes. Learn More

Serve static assets with an efficient cache policy

6 resources found

A long cache lifetime can speed up repeat visits to your page. Learn More

Minimize third-party usage

Third-party code blocked the main thread for 0 ms

Third-party code can significantly impact load performance. Limit the number of redundant third-party providers and try to load third-party code after your page has primarily finished loading. Learn More

Total Blocking Time

0 ms

Sum, over all long tasks between FCP and Time to Interactive, of the portion of each task that exceeded 50 ms, expressed in milliseconds.

JavaScript execution time

0.0 s

Consider reducing the time spent parsing, compiling, and executing JS. You may find delivering smaller JS payloads helps with this. Learn More

Defer offscreen images

Consider lazy-loading offscreen and hidden images after all critical resources have finished loading to lower time to interactive. Learn More

Server Backend Latencies

0 ms

Server latencies can impact web performance. If the server latency of an origin is high, it's an indication the server is overloaded or has poor backend performance. Learn More

Properly size images

Potential savings of 76 KiB

Serve images that are appropriately-sized to save cellular data and improve load time. Learn More

Reduce unused CSS

Potential savings of 25 KiB

Reduce unused rules from stylesheets and defer CSS not used for above-the-fold content to decrease bytes consumed by network activity. Learn More

Avoids enormous network payloads

Total size was 462 KiB

Large network payloads cost users real money and are highly correlated with long load times. Learn More

Minimizes main-thread work

0.2 s

Consider reducing the time spent parsing, compiling and executing JS. You may find delivering smaller JS payloads helps with this. Learn More

Avoid chaining critical requests

3 chains found

The Critical Request Chains below show you what resources are loaded with a high priority. Consider reducing the length of chains, reducing the download size of resources, or deferring the download of unnecessary resources to improve page load. Learn More

Avoids an excessive DOM size

220 elements

A large DOM will increase memory usage, cause longer style calculations, and produce costly layout reflows. Learn More

Avoid multiple page redirects

Redirects introduce additional delays before the page can be loaded. Learn More

Minify JavaScript

Minifying JavaScript files can reduce payload sizes and script parse time. Learn More

User Timing marks and measures

Consider instrumenting your app with the User Timing API to measure your app's real-world performance during key user experiences. Learn More

Network Round Trip Times

0 ms

Network round trip times (RTT) have a large impact on performance. If the RTT to an origin is high, it's an indication that servers closer to the user could improve performance. Learn More