I Built an npm Package and Tracked Every Download for Two Weeks. Here's the Data.
Briefly
"Week 1 total: 977 downloads. Sounds decent until you realize most of it was automated. The 214 on publish day and 455 on the version bump day were mostly npm mirrors and registry crawlers, not real users. The real organic baseline for week 1? About 71 downloads per weekday (looking at the days without publish events)."
"Week 2 told the truth. Week 2 total: 63 downloads. A 94% drop from week 1. The organic weekday average settled at 16 downloads/day. For context, the established competitor (text-readability) does ~2,100 downloads/day. I'm at 0.7% of their volume."
"Echo JS was the #1 traffic source - 18 unique visitors to the GitHub repo, and all 5 new stars (1 → 6) came during the Echo JS traffic burst on March 10-12. Once Echo JS traffic faded, starring stopped completely. Google organic search appeared - 6 unique visitors found the repo through Google."
textlens is a Node.js package providing readability scoring, sentiment analysis, keyword extraction, and SEO scoring without dependencies or API keys. The author published v1.0.0 on March 4 and tracked real performance metrics over two weeks. Week one showed 977 downloads, but most came from automated registry crawlers and mirrors; the organic baseline was around 71 downloads per weekday. Week two revealed the true market demand: 63 total downloads and 16 organic downloads per day, a 94% decline. That is roughly 0.7% of the volume of the established competitor text-readability (~2,100 downloads/day). Echo JS drove the primary traffic spike with 18 unique visitors and all five new GitHub stars. Google organic search provided six visitors, a promising sign for long-term discovery. Dev.to articles failed to convert traffic.
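The week-over-week arithmetic in the summary can be sanity-checked in a few lines of Python. The figures below come straight from the article; the npm endpoint mentioned in the comments is the registry's public downloads API, shown only as one plausible way to pull the same daily data, since the article does not say what tooling the author used:

```python
# Daily download counts for any package are exposed by npm's public API, e.g.:
#   GET https://api.npmjs.org/downloads/range/{start}:{end}/textlens
# which returns JSON with a "downloads" list of {"day": ..., "downloads": ...}
# entries. That endpoint is an illustration, not the author's confirmed method;
# the numbers below are the article's own.

week1_total = 977          # includes publish-day (214) and version-bump (455) spikes
week2_total = 63           # no publish events, so closer to real demand
week2_organic_daily = 16   # settled organic weekday average
competitor_daily = 2100    # text-readability's approximate daily volume

drop = (week1_total - week2_total) / week1_total
share = week2_organic_daily / competitor_daily

print(f"Week-over-week drop: {drop:.0%}")          # matches the article's ~94%
print(f"Share of competitor volume: {share:.2%}")  # under 1% of text-readability
```

The only subtlety the article flags is filtering out publish events: registry mirrors and crawlers inflate any day a new version lands, so the organic baseline has to be computed from the quiet weekdays in between.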
Read at DEV Community