Technology reporter
“It’s easier to get forgiveness than permission,” says John, a software engineer at a financial services technology company. “Just get on with it. And if you get in trouble later, clear it up then.”
He is one of many people using their own AI tools at work, without the permission of their IT department (which is why we are not using John’s full name).
According to a survey by Software AG, half of all knowledge workers use personal AI tools.
The study defines knowledge workers as people who “primarily work at a desk or computer”.
For some, it is because their IT team does not offer AI tools; others said they wanted to choose their own tools.
John’s company provides GitHub Copilot for AI-supported software development, but he prefers Cursor.
“It’s largely glorified autocomplete, but it is very good,” he says. “It completes 15 lines at a time, and then you look it over and say, ‘yes, that’s what I would have typed’. It frees you up.”
He says his unauthorised use is more about avoiding a lengthy approvals process than flouting policy. “I’m too lazy and well paid enough not to chase up the expenses,” he adds.
John recommends that companies stay flexible in their choice of AI tools. “I’ve been telling people at work not to renew team licences for a year at a time, because in three months the whole landscape changes,” he says. “Everybody’s going to want to do something different, and you’ll feel trapped by the sunk cost.”
The recent release of DeepSeek, a freely available AI model from China, is likely to expand the options further.
Peter (not his real name) is a product manager at a data storage company, which offers its staff the Google Gemini AI chatbot.
External AI tools are banned, but Peter uses ChatGPT through a search tool. He finds the biggest benefit of AI comes from challenging his thinking, when he asks the chatbot to respond to his plans from different customer perspectives.
“The AI is not so much giving you answers as giving you a sparring partner,” he says. “As a product manager, you have a lot of responsibility and not many good outlets to discuss strategy openly. These tools allow that in an unfettered and unlimited way.”
The version of ChatGPT he uses (4o) can analyse video. “You can get summaries of competitors’ videos and have a whole conversation [with the AI tool] about the points in the videos and how they overlap with your own products.”
In a 10-minute ChatGPT conversation he can review material that would take two to three hours of watching video.
He estimates that his increased productivity is equivalent to the company getting a third of an additional person working for free.
He doesn’t know why the company has banned external AI. “I think it’s a control thing,” he says. “Companies want a say in what tools their employees use. It’s a new frontier and they just want to be conservative.”
The use of unauthorised AI applications is sometimes called “shadow AI”. It’s a more specific version of “shadow IT”, which is when someone uses software or services the IT department hasn’t approved.
Harmonic Security helps to identify shadow AI and to prevent corporate data being entered into AI tools inappropriately.
It is tracking more than 10,000 AI apps and has seen more than 5,000 of them in use.
These include custom versions of ChatGPT and business software that has added AI features, such as the communications tool Slack.
However popular it is, shadow AI comes with risks.
Modern AI tools are built by digesting huge amounts of information, in a process called training.
Around 30% of the applications Harmonic Security has seen in use train on information entered by the user.
That means the user’s information becomes part of the AI tool and could be output to other users in the future.
Companies may be concerned about their corporate secrets being exposed by an AI tool’s answers, but Alastair Paterson, CEO and co-founder of Harmonic Security, thinks that is unlikely. “It’s pretty hard to get the data straight out of these [AI tools],” he says.
However, firms will be concerned about their data being stored in AI services they have no control over, no awareness of, and which may be vulnerable to data breaches.
It will be hard for companies to fight the use of AI tools, because they can be extremely useful, particularly for younger workers.
“[AI] allows you to cram five years’ experience into 30 seconds of prompt engineering,” says Simon Haighton-Williams, CEO at The Adaptavist Group, a UK-based software services group.
“It doesn’t wholly replace [experience], but it’s a good leg up, in the same way that having a good encyclopaedia or a calculator lets you do things that you couldn’t have done without those tools.”
What would he say to companies that discover they have shadow AI use?
“Welcome to the club. I think probably everybody does. Be patient, understand what people are using and why, and figure out how you can embrace and manage it. You don’t want to be left behind as the organisation that hasn’t [adopted AI].”
Trimble provides software and hardware to manage data about the built environment. To help its employees use AI safely, the company created Trimble Assistant, an internal AI tool based on the same AI models used in ChatGPT.
Employees can consult Trimble Assistant for a wide range of applications, including product development, customer support and market research. For software developers, the company provides GitHub Copilot.
Karoliina Torttila is director of AI at Trimble. “I encourage everybody to explore all kinds of tools in their personal life, but recognise that their professional life is a different space, and there are some safeguards and considerations there,” she says.
The company encourages employees to explore new AI models and applications online.
“This brings us to a skill we’re all forced to develop: we have to be able to understand what sensitive data is,” she says.
“There are places where you would not put your medical information, and you have to be able to make those kinds of judgement calls [for work data, too].”
Employees’ experience of using AI at home and on personal projects can shape company policy as AI tools evolve, she believes.
There needs to be “constant dialogue about what tools serve us the best”, she says.