9 min read
26 May 2025
Vibe Coding is more popular than ever. You see it all over the internet — videos exploring its nuances, tutorials showing off cool builds, and, of course, the fun and humorous side of it. So why not ride the trend and vibe code our first HERE app together? We'll evolve an "AI-generated" prototype into practical, real-world code with intent.
In this post, we’ll build a simple tool to validate our HERE API key — all while keeping things relaxed and approachable. This isn’t your typical tutorial though. Think of it as a lowkey walkthrough with just enough structure to keep the "vibe."
If you’ve read some of my recent blog posts, you already know I like testing my API keys against various HERE APIs — and I always want to see the response right in the browser. Usually, I get a JSON object back in the response, but sometimes I work with images too (like static map tiles).
What if we built a simple API key validator — a tool that checks whether your HERE API key returns a valid response from multiple HERE APIs? It could save time when debugging, and it gives instant feedback about which services are working.
I’m a frontend engineer, so I’ll be using the tech stack I feel most comfortable with:
React
Vite JS
TanStack React Query
You’re free to pick whatever tools suit you best. Want to use just plain HTML/CSS/JS? Go for it. You can still follow along, and we might even explore alternate setups in future posts.
We will be testing Le Chat to see how good the responses are and how well a simple free-tier large language model can generate a React application.
Follow along with my conversation where I build the HERE API Key Validator here!
I like to start by defining the idea in simple terms: what the application will do and what technology stack will be used. Initially, I was undecided between React and vanilla JavaScript, so I asked for guidance on both.
To quickly create the UI, I reviewed the provided information before pasting it into Visual Studio Code.
The response I received was more than satisfactory, and I decided to lean towards using React. This decision aligns with advice from my mentor at the beginning of my front-end journey: "Do everything in React." Additionally, I recalled from previous blog posts how easy it was to use TanStack, so I decided to implement that as well.
After specifying my needs, my next question to the large language model was whether it could implement Vite and TanStack Query for React.
The response from the model provided a clear explanation of the libraries we would use and how to set up the project, including instructions on installing dependencies.
TanStack works well with both Axios and fetch, but my preference for making API requests is fetch. Therefore, I asked the large language model to implement fetch instead of Axios, which also reduces the need for additional libraries in this case. The model successfully implemented fetch using async/await, and it worked well with the correct endpoints.
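As a rough illustration of the idea (the function name and error handling here are my own sketch, not the chat's exact output), such a fetch helper might look something like this:

```js
// Hypothetical helper: checks whether the API key gets a valid
// response from a single HERE endpoint.
async function checkService(url, apiKey) {
  // HERE REST APIs typically accept the key as an apiKey query parameter.
  const separator = url.includes("?") ? "&" : "?";
  const response = await fetch(`${url}${separator}apiKey=${apiKey}`);

  if (!response.ok) {
    // A 401/403 usually means the key is invalid or the service is not
    // enabled for it; other statuses point at the request itself.
    throw new Error(`${url} responded with status ${response.status}`);
  }
  return response.json();
}
```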
Initially, I added the necessary code to my main.jsx file. The large language model provided some service code and endpoints, although some of the endpoints were incorrect, so it's important to verify them. However, the overall code was straightforward and well-structured.
Additionally, the implementation included the useQueries hook, which is useful for making multiple API requests at once.
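To give a feel for it, a useQueries setup along these lines turns the services constant into one parallel request per service. The service list and query keys below are my own placeholders, and the endpoints should be verified against the HERE documentation:

```js
import { useQueries } from "@tanstack/react-query";

// Placeholder service list; verify the real endpoints in the HERE docs.
const services = [
  { name: "Geocoding", url: "https://geocode.search.hereapi.com/v1/geocode?q=Berlin" },
  { name: "Routing", url: "https://router.hereapi.com/v8/routes?transportMode=car&origin=52.5,13.4&destination=52.52,13.45" },
];

// checkService is the fetch helper sketched earlier.
function useServiceChecks(apiKey) {
  return useQueries({
    // One query per service, all fired in parallel.
    queries: services.map((service) => ({
      queryKey: ["validate", service.name],
      queryFn: () => checkService(service.url, apiKey),
      retry: false, // an invalid key should fail fast rather than retry
    })),
  });
}
```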
I encountered some issues while running the code. Typically, when I have a code issue, I provide the full code sample to the chat to see what insights or solutions it might offer.
The large language model usually responds with encouragement, such as "You are making good progress and are on the path to victory." It also provides suggestions on steps to improve and complete the implementation. I really appreciate this kind of feedback.
In this case, I already knew that I needed to define the QueryClient for TanStack React Query, but I was feeling a bit too lazy to do it myself, so I thought I'd let the chat handle it. The chat provided a helpful response, suggesting that I create a file named queryClient.js, set up the QueryClient there, and import it wherever it is needed. That is indeed a proper way to handle it. However, I prefer to manage this setup directly in my main.jsx file instead.
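For reference, that suggestion boils down to something like this (a sketch of the pattern, not the chat's exact code):

```js
// queryClient.js: the separate-file approach the chat suggested.
import { QueryClient } from "@tanstack/react-query";

// A single shared client that main.jsx (or anything else) can import.
export const queryClient = new QueryClient();
```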
Since I'm “vibing” and want to write as little as possible myself, I asked the chat to define the QueryClient directly in main.jsx instead of importing it from an external JS file. The chat handled this request quite well, providing clear instructions on how to do it. All I had to do was copy and paste the provided code.
Additionally, the chat offered a good explanation of what a QueryClient is and what a QueryClientProvider does, which was very helpful.
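The end result in main.jsx looked roughly like this; treat it as a sketch of the pattern rather than the exact generated code:

```jsx
// main.jsx: QueryClient defined inline instead of imported
import React from "react";
import ReactDOM from "react-dom/client";
import { QueryClient, QueryClientProvider } from "@tanstack/react-query";
import App from "./App";

// The QueryClient holds the cache and default options for all queries.
const queryClient = new QueryClient();

ReactDOM.createRoot(document.getElementById("root")).render(
  <React.StrictMode>
    {/* The provider makes the client available to hooks like useQueries. */}
    <QueryClientProvider client={queryClient}>
      <App />
    </QueryClientProvider>
  </React.StrictMode>
);
```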
I received a standard CORS error. I can't recall the exact details now (it might have surfaced as a fetch request error), but it wasn't a major issue.
When troubleshooting, the first thing I do is paste the error into the chat to get some direction. In this case, the error was related to Cross-Origin Resource Sharing (CORS), a common issue in front-end development where the browser blocks requests from one origin to another. It was interesting to see this error, especially since the API is public and accessible with an API key.
Typically, you can create a proxy to handle such issues, allowing fetch requests to be sent from localhost. However, since this is a public API, it should be accessible with just an API key, so it's curious that the CORS error appeared in the first place.
The chat provided guidance on how to modify the API request to use a proxy. You can see the addition of /api in front of the endpoint in the services constant:
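Roughly speaking (my reconstruction, with a placeholder query), the constant ends up looking like this:

```js
// The endpoint now starts with /api so it goes through the local
// dev-server proxy instead of hitting the HERE domain directly.
const services = [
  { name: "Geocoding", url: "/api/v1/geocode?q=Berlin" },
  // ...the other services follow the same /api pattern
];
```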
Initially, I gathered the services I wanted to test, expecting the chat to build correct API requests. However, it didn't quite meet my expectations.
I then asked to update the endpoints in the services constant so that each service would include /api in its path. For example, each service endpoint would be modified to something like /api/v1/geocode.
With these updates, the application should work correctly.
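For completeness, the piece that makes those /api paths resolve during development is the Vite dev-server proxy. This is my own sketch of such a configuration, not code from the chat:

```js
// vite.config.js: hypothetical dev-server proxy for the /api prefix
import { defineConfig } from "vite";
import react from "@vitejs/plugin-react";

export default defineConfig({
  plugins: [react()],
  server: {
    proxy: {
      // Forward /api/... to the HERE Geocoding host and strip the prefix;
      // in practice, each HERE host would need its own prefix and rule.
      "/api": {
        target: "https://geocode.search.hereapi.com",
        changeOrigin: true,
        rewrite: (path) => path.replace(/^\/api/, ""),
      },
    },
  },
});
```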
And with this, my ‘first’ vibe-coded app was done. To be honest, I had to make some improvements myself, so we can't call it a 100% vibe-coded app, but it works well considering how little time it took me to build.
Feel free to expand on it; the complete, working application can be cloned from the GitHub repository.
Stay tuned for the next blog post, where I will delve into the code on GitHub and provide a detailed explanation of the final application.
Happy (VIBE) coding and see you in the next one.
Alberts Jekabsons
Sr. Developer Evangelist