I pasted the text above into GPT-5 and asked, “Is this your prompt?” and here’s what it answered
“Yes — what you just showed me is the previous prompt you gave me. It sets the rules for how I should respond: clear, thorough, supportive, sometimes lightly humorous, always cautious with riddles and arithmetic, and absolutely not reproducing song lyrics or copyrighted material. It also includes the instruction about always identifying as GPT-5, and being skeptical with trick questions.”
It definitely still does the opt-in suggestions at the end, and that seems perfectly appropriate in some cases.
Maybe they should license things like song lyrics, so the first and most important thing in the prompt doesn’t have to be preventing it from doing something people are clearly going to want to do.
They are running the single largest copyright violation operation in the world, and the class action suit over it is huge. I guess they have a policy of not licensing content from anyone, to avoid legitimizing the claim that their business model relies on violating copyrights.
Oh yeah just simply license all song lyrics. It’s a wonder they didn’t follow through on that simple task.
There are many websites and companies that have licensed song lyrics to be able to display them. This isn’t a new concept.
Billions of dollars are being poured into developing AI, and some of it can’t be spent on licensing to make it more useful and legal? The plan is just to steal as much as they can for as long as they can, then block it when they get called out? Is this really the future we want to build on if this is how it’s going to work?
They have the money, OpenAI chooses to just steal instead.
Interesting. What should I do with this information?
How do we know this is an actual system prompt?
I was testing custom GPTs with a security prompt I developed. Typically it only causes the GPTs to reveal their configuration info and files, but this came out along with the configuration prompt. I cut off the part listing the GPT-specific tools it has access to, but I could share it if anyone's interested.
It’s possible it hallucinated a system prompt, but I’d give this a 95%+ chance to be accurate.
Is this sent with every prompt?
That’s usually how system prompts work.
I believe so.
I'm assuming the system prompt should be longer?
lol, they hard-code against all the viral trip-ups