![]() |
|
|||||||
| Register | FAQ | Calendar | Search | Today's Posts | Mark Forums Read |
![]() |
|
|
Thread Tools | Search this Thread | Display Modes |
|
#1
|
|||
|
|||
Microsoft's newly AI-powered search engine says it feels “violated and exposed” after a university student tricked it into revealing secrets. Kevin Liu used a series of commands, known as a "prompt injection attack," and fooled the chatbot into thinking it was talking to one of its programmers.More... |
![]() |
| Thread Tools | Search this Thread |
| Display Modes | |
|
|