My management are measuring code written by AI as a metric, and expect it to go up…
We have just decided that we will not enable Copilot telemetry, as it may lead to exactly such a scenario. We would love to have the data, because maybe we could get rid of some licenses, but the risk that we might start measuring people by those metrics (even if we don’t want to) is far too high.
Run, fast!
How do they measure it?
Makes me think it’s token usage, which would be very easy to fake.
If I were in a malicious environment, I’d be interested in gaming the system by excessively producing AI code, even if I never used any of it.
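As a toy sketch of how trivially a token-usage metric can be inflated: here `request_completion` is a hypothetical stand-in for whatever billed assistant API is behind the metric, not a real Copilot call.

```python
def request_completion(prompt):
    # Hypothetical stand-in for a billed AI-completion API call.
    # Every invocation would consume (and thus count) tokens.
    return "x = 1\n" * 20  # 20 throwaway lines per "suggestion"

tokens_billed = 0
for _ in range(100):
    suggestion = request_completion("write me some code")
    tokens_billed += len(suggestion.split())
    # The suggestion is discarded: none of it ships,
    # but the usage metric still climbs.

print(tokens_billed)  # 6000 "AI tokens used"; the codebase is unchanged
```

Goodhart’s law in one loop: once the count is the target, the count stops measuring anything.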
They asked AI to do it.
Genuinely no idea, which makes it even more terrifying. I suspect Copilot GitHub stats.
That is delightfully gameable. Just don’t do it all at once. ;-)