I don't put many miles on a car since I've always worked close to home.
I've been looking at used cars at a large local new-car dealership that carries major brands. I noticed they don't offer any kind of warranty on their used cars unless the car is new enough that it still has some of the factory warranty left.
Is this normal for dealers now? I always thought the reason to buy from a dealer was that you'd get a car that had been inspected and came with some kind of guarantee, even if it was just 90 days.
I'm sure they would offer you an extended warranty for more $$$$$.