Slashdot Asks: What's the Worst Review You Ever Saw on Amazon? (slashdot.org)

Long-time Slashdot reader theodp shared his story about the worst tech book review he found on Amazon in 2019. Stephen Few is a respected author and speaker whose books on data visualization and analysis are well-received. But when it comes to Amazon reviews, you simply can't make everyone happy. A particularly good example is a one-star review he received for The Data Loom: Weaving Understanding by Thinking Critically and Scientifically with Data.

So, what is it that the reviewer didn't like about Few's latest book? "THIS IS NOT A BOOK ON WEAVING TECHNIQUES," complains P. Dennis in her 1-star review. "Was not paying attention, I guess. Very disappointed."

Amazon shows potential buyers that 5 people found Ms. Dennis's 1-star review helpful, while hiding 6 comments that complain about Amazon's allowance of the 'ridiculous' review [including two from the frustrated author, who asks, "Would you give J. D. Salinger's book 'The Catcher in the Rye' a 1-star review because it is not about baseball?"].

And that, kids, can be the difference between a 4 and a 5 rating on Amazon if your book is lightly reviewed!

I still remember when Amazon shared their own favorite fake customer reviews, posting on the front page of Amazon in big orange letters, "You guys are really funny," and adding that "occasionally customer creativity goes off the charts in the best possible way."

But sometimes their reviewers are just stupid.

Leave your own favorite examples in the comments.

What's the worst review you ever saw on Amazon?

Ask Slashdot: What Will the 2020s Bring Us?

dryriver writes: The 2010s were not necessarily the greatest decade to live through. AAA computer games were not only DRM'd and internet tethered to death but became increasingly formulaic and pay-to-win driven, and poor quality console ports pissed off PC gamers. Forced software subscriptions for major software products you could previously buy became a thing. Personal privacy went out the window in ways too numerous to list, with lawmakers failing on many levels to regulate the tech, data-mining and internet advertising companies in any meaningful way. Severe security vulnerabilities were found in hundreds of different tech products, from Intel CPUs to baby monitors and internet-connected doorbells. Thousands of tech products shipped with microphones, cameras, and internet connectivity integration that couldn't be switched off with an actual hardware switch. Many electronics products became harder or impossible to repair yourself. Printed manuals coming with tech products became almost non-existent. Hackers, scammers, ransomwarers and identity thieves caused more mayhem than ever before. Troll farms, click farms and fake news factories damaged the integrity of the internet as an information source. Tech companies and media companies became afraid of pissing off the Chinese government.

Windows turned into a big piece of spyware. Intel couldn't be bothered to innovate until AMD Ryzen came along. Nvidia somehow took a full decade to make really basic realtime raytracing happen, even though smaller GPU maker Imagination had done it years earlier with a fraction of the budget, and in a mobile GPU to boot. Top-of-the-line smartphones became seriously expensive. Censorship and shadow banning on the once more-open internet became a thing. Easily-triggered people trying to muzzle other people on social media became a thing. The quality of popular music and music videos went steadily downhill. Star Wars went to shit after Disney bought it, as did the Star Trek films. And mainstream cinema turned into an endless fest of VFX-heavy comic book movies, remakes/reboots, and horror films. In many ways, television was the biggest winner of the 2010s, with many new TV shows with film-like production values being made. The second winner may be computer hardware, which delivered more storage/memory/performance per dollar than ever before.

To the question: What, dear Slashdotters, will the 2020s bring us? Will things get better in tech and other things relevant to nerds, or will they get worse?

Slashdot Asks: What's Your Favorite Podcast? (pocketcasts.com)

Pocket Casts, one of the most widely used podcast apps, has shared a list of the podcasts most subscribed by its user base this year. The top 10 podcasts this year were:

1. The Joe Rogan Experience.
2. This American Life.
3. Stuff You Should Know.
4. Serial.
5. The Daily.
6. Reply All.
7. Waveform: The MKBHD Podcast.
8. Dan Carlin's Hardcore History.
9. Radiolab.
10. 99% Invisible.
Did your favorite podcast make it to the list? If not, what are some of the podcasts you listen to that you enjoy?

Ask Slashdot: Will Future TVs Be Able To DeepFake Actor Faces In Realtime?

dryriver writes: We've all seen the DeepFake videos on YouTube, where a different actor's face is digitally inserted into a film scene in place of the original. Some of these DeepFakes are actually quite convincing. DeepFakes are currently computationally intensive, but may one day happen in realtime on hardware custom-made to accelerate the process. Now to the question: Will this "digital face swapping" be a realtime feature in future TVs some day? Will people be able to say to their TV, "I don't like this actor/actress. Replace him/her with _actorname_ please"? Or watch a $100 million movie with their own face on an actor's body, essentially making the TV owner the star of the movie playing? Will this perhaps become so normal some day that people in the future look back at our era and say, "In those days, you couldn't choose which actors to watch any given piece of content with. Technology wasn't as advanced as it is today back then"?

Ask Slashdot: Is There A Laptop That Uses Rechargeable 18650 Cell Batteries?

"Present laptop dying, battery of course," writes long-time Slashdot reader ClarkMills. It uses proprietary pouch Lithium cells. Wouldn't it be great to just swap in a new set of 18650s? Okay, it may not be a thin laptop, but it would save me from turfing a perfectly good laptop otherwise...
The original submission drew some interesting comments -- including one from long-time Slashdot reader thegreatbob suggesting a used laptop might be the only option. "This seems to be due to the notion that 'thinner!' and 'lighter!' are more important than 'doesn't periodically turn into an incendiary pillow!'"

But are there other options? Share your own thoughts in the comments.

Is there a laptop that uses rechargeable 18650 cell batteries?
Christmas Cheer

Ask Slashdot: At What Age Should Toddlers Get Screen Time? (kidshealth.org)

Slashdot reader ne0phyte73 writes: I got my first computer (a Commodore 64) when I was 13. My daughter got hers (One Laptop Per Child) when she was 5.

What are the current trends?

I see new AI-powered edutainment products coming to the market, targeted at toddlers. Would you give something like this to your 18-month-old? (KidsHealth claims that there should be no screen time at all until 18 months, with the exception of "video chatting with grandparents or other family friends, which is considered quality time interacting with others.") Well, the developers of "Animal Island Learning Adventure" claim that they provide quality interaction with AI-powered characters. Do you believe the developers' claims that this or similar systems help toddlers to develop?

Would you give it to your child?

If this is, in fact, a "quality interaction", would you give it to kids even before they are 18 months old?

One review site said that particular learning adventure offers a tablet "pre-loaded with 60 days of ad-free content" focused on learning skills for preschoolers. Personally, that just makes me worry what would happen after 60 days. But share your own thoughts in the comments.

At what age should toddlers get screen time?
Data Storage

Ask Slashdot: How Important Is Upgradable Storage and Memory When Buying a New Computer?

davidwr writes: If you were going to buy a desktop or laptop computer, how important is it to be able to upgrade memory and storage after your purchase? Is not being able to upgrade an "automatic no-buy," assuming you can get an upgradeable computer that meets your needs? If not, would you be willing to pay a little more for upgradeability? A lot more?

Personally, I like to keep computers 4-6 years, which means I prefer to buy an upgradeable machine then upgrade it after 2 or 3 years using then-much-cheaper or not-available-at-all-today parts. What are your thoughts?

Ask Slashdot: Who Is Most Likely To Challenge Microsoft In the Office?

Tablizer writes: Microsoft still dominates cubicle-land. Google is making a push into that domain, but it's unclear how far or how fast they can go. Most "serious" applications still run only on Windows, and that doesn't seem to be changing much. What's keeping others out? Do we need new desktop-oriented, cross-platform standards? It seems everyone "went web" and forgot about the desktop niche, but it's still a big niche.

Ask Slashdot: Would You Pay To Subscribe To YouTube?

Long-time Slashdot reader shanen writes: If you don't watch YouTube, then more power to you, but if you do watch it, then I bet you have noticed more and more intrusive and noisy and much longer ads, along with frequent reminders that you can pay up and make the noise go away.

Feels like extortion to me and I'm not going to pay a blackmailer. But someone must be paying up. Is it you? Or do you even know anyone who is paying?

The original submission also shares shanen's argument that Google is exploiting copyright loopholes to monetize other people's copyrighted content. "It wouldn't even matter how much pirate video is uploaded to YouTube if the Google didn't make it easy to find... If the Google actually wanted to stop the piracy, the algorithm is obvious... The famous content has famous keywords, and the searches for those keywords can be whitelisted. Pirate results can be disappeared and replaced with results that belong to the actual creator, with legitimate exceptions for fair use." (But instead, the argument goes, they're just asking you for money to remove their ads on that content...)

That's shanen's opinion -- but what's yours? And would you pay to subscribe to YouTube?

Ask Slashdot: Is Your Company Using Linux Desktops?

SomeoneFromBelgium writes: Yesterday I spoke to a friend of mine who works for a company developing mostly integrated network solutions which are purely Linux-based. He complained that he was unable to convince his IT department to provide him and his fellow developers and testers with a Linux desktop. They stated that "it was more secure when using a VM".

We both agreed that the more likely problem is that the IT department is solely geared towards a Windows desktop environment and that they have neither the skills nor the inclination to support any other platform.

This got me wondering: is this also your experience?

I bet Slashdot's readers have stories to tell, with enlightening experiences in corporate workplaces over the years gone by. So feel free to share your thoughts, opinions, and anecdotes in the comments.

And is your company using Linux desktops?

Can You Use Modern Displays With Vintage Computing Hardware?

Long-time Slashdot reader 50000BTU_barbecue likes using vintage computers from the 1980s and early 1990s -- "real hardware with all the weirdness that goes with it."

But what do you do for a monitor? Especially when "old CRTs are starting to lose sharpness and brightness and may get tossed or damaged when moving..." We still use the same electrical plugs, and keyboards and joysticks are still similar-looking. But display devices have become these enormous high-resolution devices with fewer and fewer analog inputs... The solution is to use some sort of video upscaler.

There are many devices offered, from cheap Chinese units for about $10 to old professional studio scalers from 10-20 years ago. The Chinese units have no controls and are quite variable in the results obtained. But they're cheap. The old scalers would deliver professional results but are not guaranteed to work with consumer monitors or lock onto the non-standard timings of the non-interlaced "240p" video common on 8-bit computers.
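The sharpness difference between the cheap boxes and the good scalers mostly comes down to how they resample. The approach retro enthusiasts generally want is integer (nearest-neighbor) scaling: each source pixel becomes an N×N block, so the 240-line image maps cleanly to, say, 720 or 960 lines with no interpolation blur. A minimal sketch of the idea in Python (the function name and the nested-list frame representation are illustrative, not taken from any particular device):

```python
def integer_upscale(frame, factor):
    """Nearest-neighbor integer scaling: every source pixel is
    repeated into a factor x factor block, so pixel edges stay sharp."""
    out = []
    for row in frame:
        scaled_row = []
        for px in row:
            scaled_row.extend([px] * factor)          # widen the row
        out.extend([scaled_row[:] for _ in range(factor)])  # repeat the row
    return out

# A 2x2 "frame" scaled 3x becomes a 6x6 frame.
frame = [[1, 2],
         [3, 4]]
big = integer_upscale(frame, 3)
```

Bilinear or bicubic filtering (what many cheap converters do) would instead blend adjacent pixels, which is exactly what softens the chunky look of 8-bit graphics.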

What device do you use?

Leave your own thoughts and suggestions in the comments. How can you use modern displays with vintage computing hardware?

Ask Slashdot: Are We Teaching Children The Wrong Way To Read? (apmreports.org)

Slashdot reader Thelasko says his oldest child made some "interesting" statements when they came home from first grade: One particular phrase that bothers me is, "I can read pictures." Recently, I heard a radio show on NPR about whole-language reading instruction, and how it's a terrible way to learn. I've since learned that this is a hotly debated topic. I learned to read in a phonics-only setting. To me, this is the only way to read. I don't look at pictures, or the rest of the sentence, unless I am completely clueless about what a word is. This whole-language approach just seems wrong. Have any Slashdot members been through this experience with their children?

Did anyone find good research supporting one way or the other, not just opinion? What is your opinion on whole-language versus phonics only reading instruction?

Other Slashdot readers shared some thoughtful comments. I75BJC wrote: From my personal experience, the Whole Word Method of learning to read did not help me. It limited my vocabulary and, especially, my ability to learn new words by myself. In a word, the Whole Word Method "SUCKS". Big time!

My 3rd grade teacher was horrified at our lack of reading skills (after 2 years of the Whole Word Method) and began teaching Phonics to the class. That helped but she could not dedicate the time to Phonics as if it were the way to read. It helped a lot but it didn't undo the damage that the Whole Word Method caused. Having been taught both Phonics and the Whole Word Method, I would say, from experience, that Phonics is the better method. As an Education Major in college, I would state that my professional opinion is that Phonics is vastly superior.

BTW, the debate between Phonics and the Whole Word Method has been going on for decades -- more than 50 years...

And Iamthecheese wrote: Some children learn better by listening, some by reading, some by doing. Some will learn by phonics best, some by getting cues, and most from a combination of these. You know what a child needs? Teachers and parents who love them enough to try different methods if the child is struggling. That's what's missing.

Schools that are glorified daycare and parents who don't have time for their children are the problem. Fix that and everything falls into place. Love the children enough to make sacrifices for them and treat them as individuals...

Where do other Slashdot readers stand on this debate? Leave your own thoughts in the comments.

Are we teaching children the wrong way to read?

Ask Slashdot: How Do You Teach Inventing To Kids?

dryriver writes: Everybody seems to think these days that kids desperately need to learn how to code when they turn six years old. But this ignores a glaring fact -- the biggest shortage in the future labor market is not people who can code competently in Python, Java or C++, it is people who can actually discover or invent completely new and better ways of doing things, whether this is in CS, Physics, Chemistry, Biology or other fields. If you look at the history of great inventors, the last truly gifted, driven and prolific non-corporate inventor is widely regarded to be Nikola Tesla, who had around 700 patents to his name by the time he died. After Tesla, most new products, techniques and inventions have come out of corporate, government or similar structures, not from a good old-fashioned, dedicated, driven, independent-minded, one-person inventor who feverishly dreams up new things and new possibilities and works for the betterment of humanity.

How do you teach inventing to kids? By teaching them the methods of Genrikh Altshuller, for example. Seriously, does teaching five to seven year olds 50-year-old CS/coding concepts and techniques do more for society than teaching kids to rebel against convention, think outside the box, turn convention upside down and beat their own path towards solving a thorny problem? Why does society want to create an army of code monkeys versus an army of kids who learn how to invent new things from a young age? Or don't we want little Nikola Teslas in the 21st Century, because that creates "uncertainty" and "risk to established ways of doing things?"
Data Storage

Ask Slashdot: What Happened To Holographic Data Storage? (youtube.com)

dryriver writes: In an episode of the BBC's Tomorrow's World broadcast all the way back in 1984, a presenter shows hands-on how a laser hologram of a real-world object can be recorded onto a transparent plastic medium, erased again by heating the plastic with an electric current, and then re-recorded differently. The presenter states that computer scientists are very interested in holograms because the future of digital data storage may lie in them. That was 35 years ago. Holographic data storage for PCs, smartphones, etc. is still not available commercially. Why is this? Are data storage holograms too difficult to create? Or did nobody do enough research on the subject, getting us all stuck with mechanical hard disks and SSDs instead? Where are the hologram drives that appeared "so promising" three decades ago?