Striving To Be the Most Productive and Creative Company in Japan: How CET Is Promoting Self-Driven Advancements Using AI/LLMs

2024-11-27

Today, the technological revolution started by AI and LLMs is impacting the world in massive ways. Here at Mercari, we value the significance of AI’s potential and are not only implementing it in our product but exploring various ways of leveraging it to improve operations and increase productivity at work.

Mercari’s Corporate Engineering Team (CET) has been promoting the adoption of various AI tools to achieve their team mission of unleashing the potential of Mercari with engineering and execution, and contribute to the company by advancing operations and boosting productivity.

The AI/LLM technology introduced by CET has already produced tangible results. For example, AI and LLMs have greatly reduced the time spent responding to back-office inquiries. We asked key members involved in this initiative how CET plans to further promote the adoption of AI/LLM technology under its basic policy.

Profiles

  • Hiroaki Shintani

    Hiroaki received his degree from the Graduate School of Engineering at the University of Tokyo. He built experience as a system engineer and project manager for core and information system development at HP Japan (now Hewlett Packard Enterprise). Later, after completing his MBA at the University of Southern California, he joined Rakuten Group in 2012. There, as part of the executive office and the Rakuten Ichiba development team, he worked on the overseas expansion of the e-commerce platform. In 2017, he was transferred to the US, where he handled the reorganization of the US region corporate IT division and supervised the division over the following five years. In 2022, he returned to Japan and assumed the role of vice general manager of the global IT division at the corporation’s head office. In October 2023, he joined Mercari as Chief Information Officer (CIO).

  • Tsuyoshi Koizumi

    Tsuyoshi serves as Director of the Corporate Engineering Team. He worked on web groupware development as a programmer at LINKcom Inc. before moving on to IT planning and business planning at Advantage Risk Management Co., Ltd. After working at M3 Digital Communications, Inc. and TeamSpirit Inc., he joined Mercari’s Accounting Products Team in September 2020 as a product manager. After holding various managerial roles within the Corporate Engineering Team, he assumed his current role of Director in March 2024.

  • Takanori Yamashita

    At SoftBank Corp., Takanori worked in mobile communications. He then joined Kobe Digital Labo Inc., where he was in charge of the implementation, operation, and security measures of internal IT systems. At OMRON Global, Takanori gained experience in forecasting product demand using data analysis and machine learning, and in devising the IT strategy for the entire company. Takanori joined Mercari’s Corporate Engineering Team in May 2022. His current role is promoting the adoption and use of AI throughout the company.

Contributing to stable operation and advancement of operations using AI/LLMs

—To get things started, could you tell us what CET is currently working on?

@hiroshin: In FY2025.6 (July 2024 to June 2025), Mercari Holdings (HD) established their key mission as “expand Mercari’s value-circulation ecosystem.” To achieve this mission, HD needs to fulfill four roles: rule-making, resource allocation, perception change, and stable operation and advancement of operations.

At CET, we feel that we have an obligation to greatly contribute to the role of stable operation and advancement of operations. Our basic policy is built around a commitment to the goal defined in our roadmap: further refining the quality and efficiency of routine operations, and fundamentally changing the way operations are conducted through AI and proactive DX, thereby significantly increasing stability and efficiency.

CET’s basic policy is to advance operations in three areas: IT infrastructure and IT service management, application platform and corporate systems, and organization and organizational management. With this in mind, we’re implementing an AI/LLM-based approach from two different angles.

Hiroaki Shintani (@hiroshin)

The first is creating resources that users can search themselves and that incorporate AI to its fullest potential. This initiative aims to greatly improve the employee experience and boost productivity in our IT organization. For instance, support tasks like replacing PCs, granting and revoking access to various tools, and creating mailing lists are tasks that IT staff perform by following a manual. A generative AI agent can take over these tasks, freeing IT staff from repetitive work so they can focus on more creative work.

The second is using AI/LLM technology to improve members’ productivity. Our approach includes both developing in-house tools and adopting external solutions.

We developed an in-house AI chatbot for internal inquiries in order to help users find information and solve problems by themselves. This chatbot can find information specific to Mercari in Merportal (Mercari’s knowledge base built on ServiceNow), Slack, and Confluence, and provide accurate answers to users’ questions. @ISSA was in charge of developing this and will get into the details a bit later.

We wanted to reduce the large number of labor hours spent on writing meeting minutes and creating documents. We decided to run a proof of concept using Gemini for Google Workspace, led by @t-yama. Google Workspace is a tool people use often for daily tasks, so users have high expectations for what it will be able to do. In the spirit of Mercari’s recent drive to return to our roots (“Back to Startup”), we steamed ahead with this project, aiming to get ahead of our timeline for implementation.

Creating a big impact by increasing awareness of the importance of AI for daily tasks

—In a previous article, From Project to Product—The Corporate Products & IT Platform Team Creates the Best Employee Experience Through Dialogue, you spoke about your work with AI and LLMs. @ISSA, what was the background behind developing an AI chatbot, and how is this initiative coming along?

@ISSA: Inquiries to the accounting and legal teams were taking up a lot of resources. The members of these teams had to read inquiries from other employees and find the relevant information archived in Merportal or Slack to provide an answer. Also, not every member of the accounting and legal teams could provide the right answer to every question. For the user, it took a long time to get a response. For the members answering the questions, it was a lot of work: they had to open Merportal, search for the relevant page, and find the right information for every single inquiry.

At the time, there was no definitive AI or LLM tool out there to solve this kind of problem, so we started development in the dark. We did a lot of research into the technology that was available and eventually found that retrieval-augmented generation (RAG) would be the most appropriate approach, so we started development from there. After about a month, we had a prototype, which we asked the accounting team to use for inquiry tasks. It proved useful, so we began full-scale implementation the following month.
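For context, here is a minimal sketch of the general RAG pattern: retrieve the knowledge-base entries most relevant to a question, then hand them to an LLM as context for generating the answer. This is an illustration only, not CET’s actual implementation; the sample documents, the TF-IDF retriever, and the placeholder generate_answer() function are assumptions made for the example (a production system would search sources like Merportal, Slack, or Confluence and call a real LLM).

```python
# Illustrative-only sketch of retrieval-augmented generation (RAG).
# Not Mercari's implementation: the documents, retriever, and placeholder
# generation step below are assumptions for the example.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical knowledge-base snippets (in practice: pages from an internal wiki).
documents = [
    "Expense reports must be submitted by the 5th business day of each month.",
    "Invoices over 1,000,000 JPY require approval from the accounting director.",
    "Corporate card statements are reconciled automatically at month-end.",
]

vectorizer = TfidfVectorizer()
doc_vectors = vectorizer.fit_transform(documents)

def retrieve(question: str, top_k: int = 2) -> list[str]:
    """Return the top_k documents most similar to the question."""
    query_vector = vectorizer.transform([question])
    scores = cosine_similarity(query_vector, doc_vectors)[0]
    best = scores.argsort()[::-1][:top_k]
    return [documents[i] for i in best]

def generate_answer(question: str, context: list[str]) -> str:
    """Placeholder for the generation step: a real system would send this
    prompt to an LLM and return its completion."""
    return (
        "Answer the question using only the context below.\n\n"
        "Context:\n" + "\n".join(f"- {c}" for c in context)
        + f"\n\nQuestion: {question}"
    )

question = "When do I need to submit my expense report?"
print(generate_answer(question, retrieve(question)))
```

As the next answers explain, the quality of the retrieved documents largely determines the quality of the generated answer, which is why cleaning up the knowledge base matters so much.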

Currently, our chatbot is used on five internal Slack channels. The chatbot yields the best results on the accounting channel, where we’ve achieved a 45.7% reduction in total hours spent on tasks.

We think there are two factors behind its success. The first is that we worked with the accounting team to organize the data the AI model references to help it run more effectively. In other words, we cleaned up the documents on Merportal to make it easier for the AI to retrieve data based on the prompt and generate an answer.

The second is that we analyzed the response data and identified questions that the AI was having trouble answering. Some information wasn’t on Merportal, so we provided feedback to the accounting team about the missing information and they created content to fill the gaps. This process highlighted how important it is for people to understand that data is crucial when working with AI. Of course, we already knew that given the nature of our job, but it made us realize how important it is to make others aware of it as well.

First, we get people to experience using AI to make their work easier, then get them motivated to clean up the documentation. This process makes the answers more accurate. Also, AI and LLMs are perfect for visualizing how an answer is interpreted and whether an answer helped solve a particular problem.
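To make that feedback loop concrete, below is a rough sketch of the kind of response-log analysis described here: tally the questions the chatbot could not answer helpfully so the owning team can see where documentation is missing. The log format and the "helpful" flag are assumptions for the example, not the actual schema used at Mercari.

```python
# Illustrative-only sketch: surface documentation gaps from a chatbot response log.
# The log structure and field names are assumptions, not Mercari's schema.

from collections import Counter

response_log = [
    {"question": "How do I file a travel expense?", "helpful": True},
    {"question": "What is the approval flow for overseas invoices?", "helpful": False},
    {"question": "What is the approval flow for overseas invoices?", "helpful": False},
]

# Count the questions that were not answered helpfully; the most frequent ones
# point to topics missing from the knowledge base.
gaps = Counter(entry["question"] for entry in response_log if not entry["helpful"])
for question, count in gaps.most_common():
    print(f"{count}x unresolved: {question}")
```

In this kind of loop, unresolved questions become concrete feedback for the team that owns the documentation.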

Going forward, we want to automate user feedback on answer accuracy and other points, expand the amount of data the AI references, and link the AI to Confluence, for instance.

Tsuyoshi Koizumi (@ISSA)

Aiming for self-driven advancements using AI/LLMs

—@t-yama, you’ve been promoting Gemini for Google Workspace. What do you think about using AI and LLMs with applications for core tasks?

@t-yama: I see AI and LLMs as tools that people can use. The key is figuring out how to make these tools fit into people’s work and how to make them more prevalent.

The great thing about AI and LLMs is that we no longer need support staff (experts), who once played an integral role in improving productivity. When it came to data analysis and digital transformation, we needed experts to work alongside us to improve operations, and this resulted in a situation where those experts were in high demand and teams had to wait to work with them. When this happens, improvements are only considered for higher-priority tasks, and tasks on an individual level largely remain the same.

But when users are able to try out LLMs like Gemini for themselves, it’s quick and easy to go from user feedback to making improvements. That’s why these tools are ideal for boosting productivity. We already use Google Workspace throughout the company, so adopting Gemini for Google Workspace is a plus because we can easily reference resources we’ve used in the past, and it creates good synergy with the features we currently use.

Takanori Yamashita (@t-yama)

Based on internal surveys, it seems like a lot of non-engineering members are now actively using AI or LLMs in their work. I get the impression that people generally think that if they start using AI or automating their work, it will take over their jobs, so I was happy to see that most people at Mercari are positive about adopting the technology.

However, it is just a tool, and you need to understand how it functions and what its features are to use it effectively. That’s why our first step is to get people using Gemini for Google Workspace and understanding how it works; after that, it’s a matter of matching the right tools with the right jobs.

We’ve laid the groundwork to think about what tasks would benefit from AI and LLMs and how we can improve the technology internally. Ideally, we want to develop AI and LLMs into tools that every member at Mercari can use to make their work easier.

@hiroshin: CET’s mission is to unleash the potential of Mercari with engineering and execution. I think that using AI and LLMs to their fullest extent to boost productivity and work efficiency really embodies our mission. These activities will lay the foundation for Mercari to become the most productive and creative company in Japan.
