Hunting for Hidden API Endpoints Using Katana and Hakrawler

Anas H Hmaidy
4 min read · Sep 9, 2024


Good day!

I hope you are doing well. In bug bounty hunting, never tell yourself, “This idea is too silly to try; it won’t lead to anything.”

I mean, bug bounty hunting is really black-box testing, so you have to keep poking around and missing your target until you find something worth reporting.

Praise be to God, and prayers and peace be upon our master Muhammad.

O God, grant victory to our brothers in Palestine.

The target I am hunting on already had several bugs that I had found and reported before. I used to think to myself, “I know everything about this website. If something new appears, I will know it.”

Then I watched Hussein Daher’s talk, in which he said he relies heavily on Katana to crawl his targets.

Web crawling, in the context of bug bounty, refers to the process of automatically navigating a website’s structure to discover resources such as web pages, files, and scripts. This is done by following links and examining directories to build a map of the site’s content.
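For instance, a basic crawl with katana looks something like this (it accepts target URLs on stdin, and -silent just suppresses the banner):

echo "https://target.com" | katana -silent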

I had initially refused to use it on the program I was hunting because I thought to myself, “It will be useless since I already know all the URLs; there will be nothing new.”

But I decided: OK, let’s give it a try. I used katana along with an awesome tool called hakrawler by hakluke.
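If you want to follow along, every tool used in this post is written in Go, so assuming you have Go installed, something like this should fetch them all:

go install github.com/projectdiscovery/katana/cmd/katana@latest
go install github.com/hakluke/hakrawler@latest
go install github.com/tomnomnom/waybackurls@latest
go install github.com/tomnomnom/anew@latest
go install github.com/tomnomnom/unfurl@latest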

First, I grabbed the archived URLs:

echo "http://target.com" | waybackurls | anew urls.txt

That gave me about 71,000 URLs.
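A quick line count is an easy way to verify how much the archive returned:

wc -l urls.txt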

Then I ran all of those URLs through katana and hakrawler:

cat urls.txt | katana | hakrawler -d 3 | anew katana.txt

katana has a lot of good options, but I am new to it, so I decided to run it as is.

The -d option in the hakrawler command sets the crawl depth (the default is 2).
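For reference, if you do want to tune katana instead of running it with defaults, a run using a few of its common flags might look something like this; -d sets the crawl depth, -jc enables JavaScript parsing, and -kf all crawls known files such as robots.txt and sitemap.xml (katana-only.txt is just an example output name):

cat urls.txt | katana -d 3 -jc -kf all -silent | anew katana-only.txt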

I left it running for about two days and ended up with 41,000 URLs in katana.txt.
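Since a crawl like this can run for days, it helps to launch it in a way that survives your terminal closing, for example with nohup (or inside tmux/screen):

nohup sh -c 'cat urls.txt | katana | hakrawler -d 3 | anew katana.txt' > crawl.log 2>&1 &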

A lot of those URLs were unique only because of the subdomain; the paths themselves repeated. For example:

https://sub1.target.com/account
https://sub2.target.com/account

So I wanted the unique paths only. Using unfurl by Tomnomnom, we can extract just the paths:

cat katana.txt urls.txt | unfurl format %p | anew paths.txt

I got around 5,500 unique paths.
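As a side note, unfurl also ships a dedicated paths mode and a -u flag for unique output, so the same result should be reachable with something like:

cat katana.txt urls.txt | unfurl -u paths | anew paths.txt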

With the unfurl tool, you can also extract the subdomains. So I extracted the unique ones:

cat katana.txt urls.txt | unfurl format %d | anew subs.txt
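The same shortcut should work here too:

cat katana.txt urls.txt | unfurl -u domains | anew subs.txt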

While checking paths.txt, which contains only the unique paths, I initially found nothing new! All the paths were familiar to me, but wait… I was wrong.

There was one path that caught my eye, which I hadn’t seen before! Guess what the path was? It was /HelpApi/
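For what it’s worth, if you don’t want to eyeball thousands of paths by hand, a rough grep for API-looking keywords (the keyword list here is just an example) would surface candidates like this one quickly:

grep -iE 'api|swagger|docs|admin|internal' paths.txt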

Opening this path on the main domain, target.com/HelpApi/, gave me the documentation for the website’s APIs!

Now I had a lot to explore. I found an interesting endpoint:

/account/subscribe?groupId=123

I could smell an IDOR bug here. Sending the request above returned the users in group 123. By changing the groupId parameter, I was able to access information about users in other groups, including email addresses, roles, phone numbers, and other private data.
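To illustrate (with a hypothetical session cookie and group IDs), testing the IDOR is as simple as repeating the authenticated request with a different groupId and checking whether data comes back for a group you don’t belong to:

# my own group first, then someone else's (hypothetical values)
curl -s 'https://target.com/account/subscribe?groupId=123' -H 'Cookie: session=MY_SESSION'
curl -s 'https://target.com/account/subscribe?groupId=124' -H 'Cookie: session=MY_SESSION'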

I reported it as a PII (Personally Identifiable Information) disclosure, and, praise be to God, it was triaged the next day, fixed, and I was rewarded.


I hope you liked it. Yes, it’s a simple bug, but it came after a couple of days of recon. Also, huge thanks to Hussein Daher for his recent talk. I really recommend you watch it!

Don’t forget to follow me and leave a clap (You can do it up to 50 times!) Thanks for reading.

LinkedIn: anas_hmaidy

Twitter: anasbetis023

Join my Telegram channel: anas_hmaidy

Buy me a coffee: anas_hmaidy

Cheers :)
