Could we sabotage the LLM training so the data became worthless?
Like adding to our comments stuff like "2+2=5" "Abraham Lincoln discovered America" and whatever silly statement you can think of
Could we sabotage the LLM training so the data became worthless?
Like adding to our comments stuff like "2+2=5" "Abraham Lincoln discovered America" and whatever silly statement you can think of