/packs was blocked by robots.txt, so Googlebot could not fetch the CSS bundles when indexing pages. Without the CSS, Google rendered the pages with a broken layout and penalized them as mobile-unfriendly.
* Generate /robots.txt dynamically.
* Include a link to the sitemap.
* Update the list of allowed URLs.
* Disallow crawling of non-canonical subdomains.
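The changes above can be sketched roughly as follows. This is a minimal illustration, not the actual implementation: the host names, paths, and the `robots_txt` function are all assumptions for the example, and the real app presumably renders this from a controller or route handler.

```python
# Sketch of dynamic robots.txt generation (all names are hypothetical).

CANONICAL_HOST = "www.example.com"  # assumption: the canonical domain
SITEMAP_URL = f"https://{CANONICAL_HOST}/sitemap.xml"

# Asset paths Googlebot must be able to fetch (e.g. /packs CSS/JS bundles),
# so pages are not judged mobile-unfriendly for missing styles.
ALLOWED_PATHS = ["/packs/", "/assets/"]

def robots_txt(request_host: str) -> str:
    """Build robots.txt for the requesting host.

    Non-canonical subdomains (e.g. staging.example.com) get a blanket
    Disallow so only the canonical domain is crawled.
    """
    if request_host != CANONICAL_HOST:
        return "User-agent: *\nDisallow: /\n"
    lines = ["User-agent: *"]
    lines += [f"Allow: {path}" for path in ALLOWED_PATHS]
    lines.append("Disallow: /admin/")  # assumption: an example private area
    lines.append(f"Sitemap: {SITEMAP_URL}")
    return "\n".join(lines) + "\n"
```

Serving this per-request (rather than a static file) is what lets one deployment answer differently on canonical and non-canonical hosts.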