Danitsunami
@danitsunami.bsky.social
Latino 🇲🇽 | An SEO dude, doing SEO things
For added context, each brand will have a partner page that links to its sister brands.
January 13, 2026 at 11:57 PM
The business wants to let users know that the brands are related and that they're getting the same service standards set by the business that acquired them.
January 13, 2026 at 11:54 PM
A bit of a follow-up to this query: is it necessary for the SSR version to have the same 'not found' content/messaging as the CSR version, or not really?
January 12, 2026 at 8:15 PM
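If it helps to picture the SSR side of that question, here's a minimal sketch assuming an Express-style server; findRecord(), renderNotFoundPage(), and renderProductPage() are hypothetical stand-ins, not anything from the actual site:

```ts
import express from 'express';

const app = express();

// Hypothetical stand-ins so the sketch is self-contained.
async function findRecord(id: string): Promise<string | null> {
  return id === 'known' ? 'Known product' : null;
}
const renderNotFoundPage = (): string =>
  '<h1>Page not found</h1>'; // same copy the CSR 'not found' view uses
const renderProductPage = (record: string): string => `<h1>${record}</h1>`;

app.get('/products/:id', async (req, res) => {
  const record = await findRecord(req.params.id);
  if (!record) {
    // Same messaging as the CSR version, but served with a real 404
    // status, so missing pages don't need a noindex workaround.
    res.status(404).send(renderNotFoundPage());
    return;
  }
  res.status(200).send(renderProductPage(record));
});

app.listen(3000);
```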
Thanks for confirming! I agree on having one version and I'm currently advocating for it, but it seems like a huge project in itself to get there, like you mentioned. The good news is that the approach is not ruled out, so there's still a chance.
January 12, 2026 at 8:07 PM
Also note, the CSR version that returns a 200 status code has its meta robots directive set to 'noindex', following what is recommended here: developers.google.com/search/docs/...
Understand JavaScript SEO Basics | Google Search Central
January 12, 2026 at 7:54 PM
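For reference, a minimal sketch of the pattern the linked doc describes for client-side rendering: when a route turns out to have no matching content, inject a robots 'noindex' meta tag so the 200 response isn't indexed. The fetch URL and function name here are hypothetical:

```ts
// Sketch: mark a client-rendered 'not found' state as noindex.
async function renderProduct(productId: string): Promise<void> {
  const res = await fetch(`/api/products/${productId}`); // hypothetical API
  if (res.status === 404) {
    // Dynamically add <meta name="robots" content="noindex">.
    const metaRobots = document.createElement('meta');
    metaRobots.name = 'robots';
    metaRobots.content = 'noindex';
    document.head.appendChild(metaRobots);
    // ...then render the client-side 'not found' view.
    return;
  }
  // ...otherwise render the page from res.json() as usual.
}
```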
Thanks for confirming!
November 11, 2025 at 11:05 PM
Awesome, thanks for the feedback!
September 24, 2025 at 8:03 PM
It's an odd setup, for sure... aside from the "you are being redirected" messaging, the content of both versions is relatively the same. I'm leaning against the approach, but I wanted to get your POV.
July 29, 2025 at 10:44 PM
Thank you for your feedback!
July 16, 2025 at 6:51 PM
Yeah, I looked into that too, but GSC doesn't show where they're being referred from internally, which makes me believe they're being picked up from another domain. I tried running a 3rd party tool to see if there are any backlinks coming from other domains and still no luck.
February 17, 2025 at 6:25 PM
Thanks for the feedback, I'll try these disallows to see if they have any effect. Also, despite these URLs being long, I'm seeing thousands of them being discovered within the past couple of weeks. I'm hoping to fix this before it becomes a bigger issue down the line.
February 17, 2025 at 5:43 PM
Apologies for that, those pictures came out really grainy, especially when viewed on a mobile device. The dark screenshot shows that the disallow should work in the robots.txt. The 2nd one shows GSC fetching the robots.txt. The 3rd shows that the related URLs are still being crawled.
February 17, 2025 at 4:06 PM
@johnmu.com thanks for the feedback! I wasn't able to find a tester within GSC. However, there is a robots.txt report under settings. I did use an alternative tester, technicalseo.com/tools/robots.... It shows that the targeted pages should be blocked. Below are results from the tester & GSC reports
February 14, 2025 at 11:12 PM
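If a quick local cross-check alongside the hosted tester is ever useful, here's a rough sketch. It deliberately simplifies real robots.txt matching (no wildcard or Allow-precedence handling), and example.com plus the sample path are hypothetical:

```ts
// Simplified check: does a Disallow rule for Googlebot (or '*')
// prefix-match the given path in the site's robots.txt?
async function isDisallowed(origin: string, path: string): Promise<boolean> {
  const txt = await (await fetch(`${origin}/robots.txt`)).text();
  const disallows: string[] = [];
  let groupApplies = false;
  for (const raw of txt.split('\n')) {
    const line = raw.split('#')[0]; // strip comments
    const sep = line.indexOf(':');
    if (sep === -1) continue;
    const field = line.slice(0, sep).trim().toLowerCase();
    const value = line.slice(sep + 1).trim();
    if (field === 'user-agent') {
      groupApplies = value === '*' || value.toLowerCase().includes('googlebot');
    } else if (field === 'disallow' && groupApplies && value) {
      disallows.push(value);
    }
  }
  return disallows.some((rule) => path.startsWith(rule));
}

isDisallowed('https://example.com', '/faceted/colour=red')
  .then((blocked) => console.log(blocked ? 'should be blocked' : 'crawlable'));
```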
For added context, the meta robots is set up as 'noindex, follow'.
February 14, 2025 at 10:40 PM
As I dug deeper, I'm noticing that a handful of these pages that should be blocked are being blocked via robots.txt; the only difference is that they do not have the meta robots 'noindex' directive in place.
February 14, 2025 at 10:28 PM
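One thing worth noting on the noindex/robots.txt mix: if robots.txt blocks a URL, Googlebot never fetches its HTML, so a meta robots 'noindex' on that page goes unseen and the URL can still be indexed from links alone. A hedged sketch of that interplay; the regex-based meta check is a simplification and the URL is hypothetical:

```ts
// Returns whether Google could actually see the page's noindex tag.
async function noindexVisibleToGoogle(
  url: string,
  blockedByRobotsTxt: boolean,
): Promise<boolean> {
  // Blocked pages are never fetched, so their noindex is invisible.
  if (blockedByRobotsTxt) return false;
  const html = await (await fetch(url)).text();
  // Simplified detection: assumes name= precedes content= in the tag.
  return /<meta[^>]*name=["']robots["'][^>]*content=["'][^"']*noindex/i.test(html);
}

noindexVisibleToGoogle('https://example.com/some-page', true)
  .then((seen) => console.log(seen ? 'noindex is effective' : 'noindex is not being seen'));
```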