Last year I had the privilege of working on an infinite scroll AJAX website. There are many infinite scroll SEO issues that people do not fully understand or even know about. This is the fix that I came up with.
The core issue is Googlebot reaching the deeper pages of an infinite scroll site. The steps you have to take are as follows:
1) Change the current prev/next structure to rel="prev" and rel="next" links, and render page numbers in between. Example: Prev, 1, 2, 3, 4, 5 (up to 100 or however many pages you have) & Next. You do this so the spiders can see the next page even with JavaScript turned off, since the content is normally loaded via AJAX.
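The paginated fallback above can be sketched as follows. This is a minimal illustration, not a specific framework's API; the `/?page=N` URL pattern is from the post, and `base_url` is a placeholder you would swap for your own.

```python
# Sketch: render a server-side pagination fallback (Prev, 1..N, Next) so
# spiders can reach deeper pages even when JavaScript, and therefore the
# AJAX loading, is turned off.

def render_pagination(current: int, total: int, base_url: str = "/?page=") -> str:
    """Return HTML for Prev, numbered page links, and Next around `current`."""
    parts = []
    if current > 1:
        # rel="prev" points spiders at the previous page in the series
        parts.append(f'<a rel="prev" href="{base_url}{current - 1}">Prev</a>')
    for n in range(1, total + 1):
        if n == current:
            parts.append(f"<span>{n}</span>")  # current page is not a link
        else:
            parts.append(f'<a href="{base_url}{n}">{n}</a>')
    if current < total:
        # rel="next" points spiders at the next page in the series
        parts.append(f'<a rel="next" href="{base_url}{current + 1}">Next</a>')
    return " ".join(parts)
```

Because these links are plain anchors in the server-rendered HTML, a crawler that never executes your AJAX code can still walk page by page to the deepest content.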
2) Change the canonical of page #2, #3, and so on. Doing so signals to the spiders that each page is different, so they will crawl it.
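A minimal sketch of the per-page canonical described above. The domain and path here are placeholders; the point is that page 2, page 3, and so on each declare their own URL rather than all pointing back at page 1.

```python
# Sketch: emit a distinct canonical tag per paginated page so spiders
# treat each page as a separate crawlable URL. Base URL is a placeholder.

def canonical_tag(page: int, base: str = "https://example.com/items") -> str:
    """Return the <link rel="canonical"> tag for a given page number."""
    url = base if page == 1 else f"{base}?page={page}"
    return f'<link rel="canonical" href="{url}">'
```

If every paginated page canonicalized to page 1 instead, you would be telling Google the deeper pages are duplicates, which undercuts the whole point of the paginated fallback.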
3) To prevent many spammy-looking duplicate pages from showing up in the index, add the noindex, follow robots tag to each of the /?page=2/ style pages. That way you worry less about the paginated page itself and more about the items on it.
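The robots tag logic above can be sketched like this. The rule, assumed from the post, is that page 1 stays indexable while every deeper paginated page gets noindex, follow so its links are still crawled but the page itself stays out of the index.

```python
# Sketch: paginated pages (page 2+) get "noindex, follow" so spiders
# follow their links to items without indexing the page itself.

def robots_tag(page: int) -> str:
    """Return the robots meta tag for a page, or "" for indexable pages."""
    if page > 1:
        return '<meta name="robots" content="noindex, follow">'
    return ""  # page 1 stays indexable
```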
I know this sounds kinda crazy, but the pieces work together: signaling to Google that each page is different (canonical) and giving the spiders many paths to deeper pages (rel=prev & rel=next) helps them reach the deeper content, while the robots tag (noindex, follow) reduces the chance of putting out millions of duplicate pages. All of this improves Google's ability to crawl your site.
If you have any other issues or questions about infinite scroll SEO or Silicon Valley SEO, feel free to reach out to me.