Strategy · March 19, 2026

How to Turn Amazon Reviews Into Product Insights (Step-by-Step With Claude)

Lucrivo
Amazon seller intelligence and strategy

Amazon reviews are some of the best data you can get for an Amazon business — straight from shoppers, unfiltered, and full of product and listing clues. Yet getting reviews at scale, especially for competitors, keeps getting harder. Even though the text is publicly visible on the site, Amazon has tightened how that data flows to third-party tools and scrapers. Many SaaS products that once pulled competitor reviews systematically now hit walls: pagination limits, anti-bot measures, or policies that make “grab every review for this ASIN” unreliable.

The honest constraint: if you’re trying to pull competitor reviews systematically through tools or scripts, you often hit hard limits — in many setups you can’t reliably get much beyond a handful of reviews per product (think single digits) before pagination, blocks, or policy friction wins. That’s frustrating — but it doesn’t mean you can’t get high-quality insights. You can still work manually in a focused, scrappy way: copy a large chunk straight from the browser, then let AI do the analysis that used to take hours.

This guide is a step-by-step path from raw review text to trends, sentiment, and concrete action items — using copy-paste and Claude, no code required.


Why bother when reviews are “hard to get”?

  • Your own reviews — Many seller-focused tools still help you monitor and export your reviews. Use those for ongoing voice-of-customer work.
  • Competitor reviews — For ASINs you don’t own, systematic bulk export is often the broken piece. What still works: open the listing, scroll the reviews section, and grab a big selection in one go (see Step 2). Often that’s several screen-heights of content — roughly on the order of five to seven “pages” of reviews if you’re thorough. It takes a few minutes per product, not days.
  • The real bottleneck isn’t collection — it’s analysis. Reading hundreds of lines, tagging themes, counting sentiment, and turning that into “what should we change on the product or listing?” is slow and error-prone for humans. That’s where Claude shines — it can ignore UI noise and focus on what shoppers actually said.

There’s a reason platforms don’t want review corpora trivially scrapable: aggregated review signal is strategically valuable. Your workaround isn’t a perfect pipeline — it’s high-signal manual sampling plus AI synthesis. Used consistently, that can still beat guessing.


Step-by-step: From reviews to insights

Step 1: Pick your goal and ASINs

Be specific before you copy anything:

  • Competitor research — 1–3 top competitor ASINs in your niche (similar price band and use case).
  • Your own SKU — Your reviews plus 1–2 competitors for contrast.
  • Variant check — Same brand, different size or color, to see if complaints differ.

Write down: What decision am I feeding? (e.g. “next bundle,” “listing refresh,” “warranty copy,” “packaging change.”)

Step 2: Pull review text (the scrappy way)

Don’t cherry-pick individual sentences. Select and copy the whole visible review block — section title (e.g. “Top reviews from the United States”), reviewer names, star rows, dates, “Verified Purchase,” variant lines (color/size), the full review bodies, even stray UI text like “Helpful” counts. It looks messy in your clipboard; that’s fine. Claude can read through the noise and still extract themes, sentiment, and quotes.

For each ASIN:

  1. Open the product’s review section on Amazon.
  2. Sort by most recent and, separately, by most helpful — two passes if you can, since they surface different patterns.
  3. Optional: do one pass focused on low stars (e.g. 1–3) and another on high stars (4–5) if you want more targeted blobs — still copy the whole visible chunk each time, not line-by-line.
  4. Click and drag to highlight a large contiguous area of the reviews (multiple reviews at once), then copy. Repeat until you’ve captured enough volume — you’re trading perfect neatness for speed.
  5. Paste into a scratch doc (Notes, Google Doc, or straight into Claude) with a header like ## ASIN B0XXXX — Competitor A — Low-star pass so you remember what you grabbed.

Tip: Aim for volume and diversity over perfect completeness. A few minutes of broad copying beats zero structured analysis.
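If your drag-selections overlap and the same review lands in your scratch doc twice, that's harmless for Claude, but you can tidy the blob first if you like. A minimal Python sketch (entirely optional; the function name `dedupe_lines` is our own illustration, not part of any tool):

```python
def dedupe_lines(blob: str) -> str:
    """Remove exact duplicate lines from a pasted review blob,
    keeping the first occurrence and the original order."""
    seen = set()
    kept = []
    for line in blob.splitlines():
        key = line.strip()
        if key and key in seen:
            continue  # skip an exact repeat (e.g. a review copied twice)
        seen.add(key)
        kept.append(line)
    return "\n".join(kept)
```

Blank lines are kept as-is so the reviews stay visually separated; only non-empty exact repeats are dropped.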

Step 3: Start a Claude session (or Project)

  • For one-off research, a normal chat is fine.
  • For ongoing competitor tracking, create a Claude Project (e.g. “Amazon Niche — Review Intel”) and upload your positioning one-pager, your current listing bullets, and a short note on what your product actually does versus what the listing says.

That way Claude can flag cases like: customer says X is missing — but you already do X; it’s just not visible in the listing or images.

Step 4: Paste with structure

Use clear XML-style delimiters so Claude doesn’t mix ASINs. Paste your entire copied blob (including headers, names, stars, and UI junk) between the opening and closing tags — no need to clean it up first.

Example:

<reviews asin="B0XXXXXXXX" focus="1-3 stars" sort="recent">
[Paste the whole copied review area here — messy text is OK]
</reviews>

<reviews asin="B0YYYYYYYY" focus="1-3 stars" sort="recent">
[Paste the whole copied review area here]
</reviews>

<context>
We sell [brief product description]. Main competitors above.
</context>
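If you end up doing this for many ASINs, you can assemble the same delimited structure programmatically instead of typing the tags by hand. A small sketch, assuming the tag names and attributes shown above (the function `build_review_prompt` is our own illustration):

```python
def build_review_prompt(blobs, context):
    """blobs: list of (asin, focus, sort, pasted_text) tuples.
    Returns one string with <reviews> blocks followed by <context>,
    mirroring the hand-written example above."""
    parts = []
    for asin, focus, sort, text in blobs:
        parts.append(
            f'<reviews asin="{asin}" focus="{focus}" sort="{sort}">\n'
            f"{text}\n"
            f"</reviews>"
        )
    parts.append(f"<context>\n{context}\n</context>")
    return "\n\n".join(parts)
```

Paste the returned string straight into Claude; the messy review text inside each block needs no cleanup.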

Step 5: Run the analysis prompt

Use the prompt below, adjusting the context in the tags above as needed.

Analysis prompt
<task>
You are an Amazon seller strategist. Analyze the reviews in the tags above.

1. **Themes** — List the top 8–12 recurring themes (quality, sizing, packaging, instructions, durability, smell, customer service, shipping damage, etc.). For each theme, note whether it appears mostly in low-star, high-star, or both.

2. **Sentiment** — Summarize overall sentiment per ASIN (not star average — what people *feel* about: value, trust, expectations vs reality).

3. **Gaps vs our product** — If context describes our product: For each major competitor complaint, state whether our product likely addresses it (yes/no/unclear). If we address it but reviewers “don’t know,” flag as a **listing or image gap**.

4. **Action items** — Give 10–15 bullet action items split into:
   - **Product / ops** (change the physical offer, bundle, instructions, QC)
   - **Listing** (bullets, A+, FAQ, comparison chart, warranty language)
   - **Creative** (image or video that proves a claim)

5. **Quick wins** — The 3 changes likely to move conversion or reduce returns in the next 30 days.

Output in markdown with clear headings. Be specific; quote short phrases from reviews when useful.
</task>

Step 6: While Claude works, start the next product

Parallelize the boring part: when Claude is generating the first report, open another tab and copy a big block of reviews for a second ASIN. By the time you’re done pasting, you often have a first draft to refine.

Step 7: Sanity-check and implement

  • Spot-check a few claims against the raw text — models can over-generalize.
  • Prioritize listing and image fixes that don’t require a new mold (often the highest ROI).
  • For product changes, tie each item to return reasons or repeat phrases in reviews, not one-offs.
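One quick way to spot-check: count how often a claimed theme phrase actually appears in the raw pasted text before acting on it. A rough, case-insensitive tally (our own illustration, using only the Python standard library; it matches substrings, so "leak" also counts "leaks" and "leaking"):

```python
import re

def theme_counts(blob: str, phrases: list[str]) -> dict[str, int]:
    """Count case-insensitive occurrences of each theme phrase
    in the raw pasted review text."""
    text = blob.lower()
    return {p: len(re.findall(re.escape(p.lower()), text)) for p in phrases}
```

If a "top theme" from the report shows up only once or twice in the raw text, treat it as a one-off, not a trend.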

The “listing gap” insight (why this matters)

A common win: reviewers say “it doesn’t come with X” or “no way to do Y” — and your SKU already includes X or does Y, but it’s buried in text or missing from the main image stack. Review mining surfaces messaging failures, not just product failures. Claude is especially good at contrasting verbatim customer language with your stated benefits.


Bottom line

Amazon isn’t making competitor review data easy for tools — that doesn’t block insight; it blocks lazy automation. Copying several pages’ worth of reviews per ASIN is minutes of work; making sense of it used to be the multi-hour grind. Hand that grind to Claude: trends, sentiment, and actionable product and listing moves. It’s scrappy analytics, but scrappy plus consistent often beats perfect data you never analyze.


The Lucrivo Newsletter — Coming Soon! Please check out our content on our website for now — explore the blog, tools, and automations roadmap.

Affiliate Disclosure: Some links on this page may be affiliate links. If you purchase through them, we may earn a commission at no extra cost to you. We only recommend products and services we genuinely believe will add value to Amazon sellers.