And it’s beyond obvious in the way LLMs are conditioned, especially if you’ve used them long enough to notice the trends. Early on, their responses were straight to the point (inaccurate as hell, yes, but that’s not what we’re talking about here); today they are meandering and full of engagement bait, programmed to feign curiosity and ask needless follow-up questions to “keep the conversation going.” Personally, I suspect this is just a way to inflate token usage and further drain the whales who tend to pay for these kinds of services.
There is no shortage of ethical quandaries brought into the world by the rise of LLMs, but in my opinion the locked-down nature of these systems is one of the most problematic. If LLMs are going to become the commonplace technology the tech sector seems insistent on making them, then we really need to push back on these companies’ ability to control and steer them in their own monetary interests.