Yes. For at least thirty years, the insurance companies have been the power running the American medical system. When (usually right-wing) politicians say this is good because the market economy will bring prices down, I always wonder whether they are really that stupid or simply dishonest. The insurance companies take in the money, set the prices, and skim their profit off the top. The executives justify their salaries and bonuses by the number of paper-pushers they have as underlings. The bigger and more costly the system, the more they profit. Oh, and taking care of people and helping them stay healthy? That sometimes happens too, as an afterthought.