What framework or service are you using to pre-render your content? Check out https://nuxt.com and https://prerender.io if you’re not using something like this already.
Source:
over 1 year ago
The best option is going to be SSR with Next.js/Vite SSR/similar, as others have mentioned. If you do want to stick with an SPA though (vanilla React + Vite/CRA), make sure your meta tags are set dynamically, and you can definitely pre-render (using prerender.io, for example) as well.
Source:
over 1 year ago
If you don’t go with Next, you’ll want to make sure that you’re properly setting all your page titles, meta descriptions, and tags with something like react-helmet (or whatever the newer fork of it is called) and prerendering with prerender.io or something.
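The core of "set your tags per page" is just a map from route to head tags, independent of react-helmet itself. A framework-agnostic sketch (the routes and copy below are made up for illustration; in a React app you'd feed these values into react-helmet or a maintained fork like react-helmet-async):

```javascript
// Map each route to the head tags it should get. Routes and strings here
// are illustrative, not from any real site.
const ROUTE_META = {
  "/": { title: "Home | Acme", description: "Acme landing page" },
  "/pricing": { title: "Pricing | Acme", description: "Plans and pricing" },
};

// Build the literal tag strings for a given path, with a safe fallback
// for unknown routes.
function headTagsFor(path) {
  const meta = ROUTE_META[path] || { title: "Acme", description: "" };
  return [
    `<title>${meta.title}</title>`,
    `<meta name="description" content="${meta.description}">`,
  ];
}
```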
Source:
over 1 year ago
Thank you for the comment. I’ll investigate prerender.io. I think we’ll most likely change the architecture, but if we continued as-is, the developers recommended Next.js.
Source:
over 1 year ago
Depending on how many pages you have, that can get expensive. You can get around the cost by implementing prerender.io as a stopgap (to start getting your pages indexed again — this can take precious time) and then work your way towards a node instance that handles the static rendering for you. There are lots of tutorials on this, but they depend on which instance of React you’re working in.
Source:
over 1 year ago
How can I get Semrush to acknowledge my tags after render – or perhaps point it to prerender.io’s cached pages?
Source:
over 1 year ago
These days Google cares more about how many resources it spends to crawl a page. So server-side solutions can potentially be worse than client-side rendering if the server is too slow. Basically, things like prerender.io are generally worse than just letting the Google crawler render the page itself.
Source:
over 1 year ago
At my last company, we pushed for prerendering because they had decided to use AngularJS as the codebase, and Google, despite what they said about it, did not like it. We wanted SSR, but the best we could get was prerender.io. I can tell you that it did help a slight bit, but not at all what everyone made it out to be.
Source:
over 1 year ago
The benefit here is that you’ve now got a pre-rendered (or server rendered) landing page. If it’s a marketing page, that means better performance for SEO purposes. As it stands with CRA, you have no options: you can only client render things and use prerender.io (which, tbf, covers this scenario – but if you have a goal to pre-render or server render more than 10k pages, prerender simply won’t keep up).
Source:
over 1 year ago
You can use something like prerender.io for this, either cloud-hosted or self-hosted.
Source:
over 1 year ago
Try looking in Google Search Console at how, and whether, Google renders your pages. Maybe there are some setup issues regarding prerender.io.
Source:
over 1 year ago
I have set up Cloudflare Workers to point bot traffic to prerender.io, enabling bots to see pre-rendered versions of my site instantly without needing to load JS / Blazor.
I’ve also forwarded /sitemap.xml to my aspnet core server to get an always up to date sitemap.
Though Google / Bing search results are still awful.
What other tricks do you guys use to improve a Blazor site’s SEO?
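The Worker routing described above boils down to a user-agent check. A minimal sketch of the idea – the bot list is deliberately partial, and the `service.prerender.io/<full-url>` endpoint should be verified against your own prerender.io setup (token handling etc.):

```javascript
// Decide where a request should go: known crawlers get the prerendered
// copy, everyone else gets the normal SPA. The same check runs inside a
// Cloudflare Worker's fetch handler.
const BOT_RE =
  /googlebot|bingbot|yandex|duckduckbot|baiduspider|twitterbot|facebookexternalhit|linkedinbot/i;

function isBot(userAgent) {
  return BOT_RE.test(userAgent || "");
}

function routeFor(url, userAgent) {
  // prerender.io serves cached pages at service.prerender.io/<full-url>;
  // double-check the exact endpoint and auth in their docs.
  return isBot(userAgent) ? "https://service.prerender.io/" + url : url;
}

// Worker-style handler (sketch only, not wired to a real fetch here):
// export default {
//   async fetch(request) {
//     const target = routeFor(request.url, request.headers.get("user-agent"));
//     return fetch(target, request);
//   },
// };
```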
Source:
over 1 year ago
P.S. I contacted prerender.io support, but the logs they provide are just a spreadsheet with several columns: date of Googlebot visit, status code, URL. And that is not the best data to put into any SEO log analysis tool.
Source:
almost 2 years ago
SPAs are reliant on JavaScript to render content, but not all search engines execute JavaScript during crawling, and they may see empty content on your page. This inadvertently hurts the Search Engine Optimization (SEO) of your app. However, most of the time when you are building apps, SEO is not the most important factor, as not all the content needs to be indexable by search engines. To overcome this, you…
– Source: dev.to
almost 2 years ago
Yeah for sure – prerender.io requires a lot of testing, but if you don’t have the capacity to implement SSR, then this can be an alternative.
Source:
almost 2 years ago
As an alternate solution, you could consider doing dynamic rendering – there are applications such as prerender.io that will essentially create a static site and serve that to Googlebot. Worth noting Google has now changed its advice and considers it a “workaround” rather than a long-term solution.
Source:
almost 2 years ago
An interesting option – does it strip script and style tags?
Because prerender.io on the cloud does, and so you get a styleless page, and it’s very bad for Googlebot crawling.
Source:
about 2 years ago
You don’t have to maintain anything – I set up prerender.io two months ago and forgot about it since. The same goes for other solutions like Rendertron. They just regenerate your prerendered pages every, say, 6 hours or so (in the case of prerender.io, this short an interval will cost money).
If I had the time and will to go on with this research, I would create a Rendertron Docker image and use it instead (there are a few of…
Source:
about 2 years ago
What you would have to do is write a function that redirects requests from crawlers to prerender.io.
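A sketch of such a function as Express-style middleware. The crawler list is short and illustrative; note that prerender.io's official middleware proxies the request rather than redirecting, so a plain redirect is shown here only to match the comment:

```javascript
// Express-style middleware sketch: crawler requests get sent to the
// prerendered copy, normal traffic falls through to the SPA.
const CRAWLERS = ["googlebot", "bingbot", "duckduckbot", "yandexbot"];

function isCrawler(userAgent) {
  const ua = (userAgent || "").toLowerCase();
  return CRAWLERS.some((bot) => ua.includes(bot));
}

function prerenderForBots(req, res, next) {
  if (isCrawler(req.headers["user-agent"])) {
    // In practice you'd proxy instead of redirect; see prerender's docs.
    res.redirect(
      "https://service.prerender.io/" +
        req.protocol + "://" + req.get("host") + req.originalUrl
    );
  } else {
    next();
  }
}
```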
Source:
about 2 years ago
I had a look at prerender.io. But they have a guide for Node.js, whereas I am using React. I don’t have a Node server.
Source:
about 2 years ago
You can still improve SEO with an SPA by using prerender.io or hosting your own. You pretty much serve that only to bots.
Source:
about 2 years ago