The Art of Discovery: A Penetration Tester’s Journey Through a Django Misconfiguration

[Editor's Note: Eirik Valle Kjellby is an amazing gentleman and the latest, as of October 2024, addition to the ever-growing penetration testing team at River Security. He continues to amaze me in his hunt for vulnerabilities as part of our continuous and always-on penetration testing efforts. In this article, Eirik shares with us one particular journey that was both insightful and interesting. Have a good read.
~ Chris Dale]

In the world of penetration testing, the thrill lies not just in the discovery of vulnerabilities, but in the narrative that unfolds with each test. Today, I want to take you through an enlightening adventure that encapsulates the essence of what it means to be a penetration tester, with a focus on a Django application that was, quite unfortunately, running in debug mode.

I am Eirik, and I work as a penetration tester at River Security. Working continuously with micro-engagements, facing new challenges every day, is not only a fun way of working; it is also very effective in challenging threat actors and reducing risk for our customers. Enjoy this post on how we leveraged multiple vulnerabilities to, in the end, prove significant value to customers and other stakeholders.

Debug Mode: A Goldmine for Attackers

The moment we deployed our testing tools against the target, it was evident that we were in for a treat. The Django application was running in debug mode, a misstep that could have proven disastrous for our client. This setting revealed a treasure trove of information, including the complete enumeration of all paths exposed in the system. Never run Django in debug mode on internet-facing assets!

Image from Django's main website. Django is a popular Python web framework; you can explore it at https://www.djangoproject.com/
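To make the misconfiguration concrete, here is a minimal, hypothetical sketch of the relevant Django settings. It is not taken from the customer's code base, and the hostname is a placeholder:

```python
# settings.py -- illustrative sketch only, not the customer's actual configuration.
import os

# Running with DEBUG = True exposes detailed stack traces, settings and
# environment values to anyone who triggers an error or requests an unknown path.
# DEBUG = True   # <-- the misconfiguration we encountered

# Safer: drive the flag from the environment and default to off.
DEBUG = os.environ.get("DJANGO_DEBUG", "false").lower() == "true"

# With DEBUG disabled, Django requires an explicit list of hosts it will serve.
ALLOWED_HOSTS = ["example.com"]  # hypothetical hostname
```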

As we navigated through the API endpoints, we discovered several vulnerabilities. Yet the real story didn't stop there. Debug mode also displayed detailed stack traces and environment variables that are usually kept under wraps. One particular oversight caught our eye: a Python Celery worker was configured to connect to Amazon SQS and, in the process, inadvertently exposed an AWS Access Key ID and Secret Access Key. In essence, these two values act as a username and password for AWS.
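To illustrate how such a leak can happen, here is a hedged sketch of a Celery worker using the SQS transport, where the credentials end up embedded in the broker URL. The key values are placeholders, not the customer's actual configuration; when Django's debug page renders an error, settings and environment variables like these can be displayed to the visitor:

```python
# celery_app.py -- illustrative sketch; all values are hypothetical placeholders.
from celery import Celery

# Celery's SQS transport accepts credentials embedded directly in the broker URL.
# If these live in module-level settings or environment variables, Django's debug
# error page can print them in its "Settings" / "Environment" sections.
AWS_ACCESS_KEY_ID = "AKIAXXXXXXXXXXXXXXXX"          # placeholder
AWS_SECRET_ACCESS_KEY = "xxxxxxxxxxxxxxxxxxxxxxxx"  # placeholder

app = Celery(
    "worker",
    broker=f"sqs://{AWS_ACCESS_KEY_ID}:{AWS_SECRET_ACCESS_KEY}@",
)

# In a real deployment, prefer IAM roles over static keys so there is
# nothing secret to leak through settings or environment variables.
```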

The Cloud Pivot: A Wide Open Door

Armed with these credentials, we ventured into the cloud realm, where we were greeted by a staggering landscape of over a hundred different deployments. The absence of the principle of least privilege was glaringly evident. This meant that we had unrestricted access to various resources, and the possibilities were both exhilarating and alarming.

The first stop was the S3 buckets. Upon diving into them, we found about fifty buckets, many serving as dynamic code repositories for applications and web servers. With our elevated privileges, we could have updated, patched, and backdoored countless services worldwide. However, we chose restraint. The proof of concept we had gathered was robust enough to present to our customer, allowing them to take necessary action without compromising their integrity further.
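As a hedged illustration of what such restrained, read-only verification might look like with leaked credentials, here is a small boto3 sketch. The credentials are placeholders, and this is not the exact tooling we used:

```python
# Read-only proof-of-concept sketch using boto3; all values are hypothetical.
import boto3

session = boto3.Session(
    aws_access_key_id="AKIAXXXXXXXXXXXXXXXX",          # leaked key (placeholder)
    aws_secret_access_key="xxxxxxxxxxxxxxxxxxxxxxxx",  # leaked secret (placeholder)
)

# Whose credentials are these? A single STS call confirms they are live
# without touching any customer data.
print(session.client("sts").get_caller_identity()["Arn"])

# List the buckets the credentials can see -- enumeration only, no modification.
for bucket in session.client("s3").list_buckets()["Buckets"]:
    print(bucket["Name"])
```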

During our Active Focus delivery and agile micro-pentesting, our testing is limited to unauthenticated testing; authenticated testing is typically done via a retainer. However, authenticated testing may be performed at the penetration tester's own discretion, and we typically do a quick reconnaissance to help our customer understand the risk of the compromise, e.g. seeing which assets are available from an authenticated point of view.

The Ghost of Assets Past

Image from DALL-E: an abstract, ghostly representation of an IT asset, such as a computer server, in a dark, misty environment. The server has a transparent, ethereal quality.

Yet, as we delved deeper into the post-exploitation phase, an intriguing puzzle presented itself. None of the buckets seemed to contain our customer's data. A quick check with our Attack Surface Management tool, Active Focus, revealed that the domain owned by our customer pointed to an EC2 IP address in AWS that had long since been released. This asset had transitioned from 'Stale' to 'Historic' in our platform, indicating it was no longer active. Based on research already done by Assetnote (https://www.assetnote.io/resources/research/eliminating-dangling-elastic-ip-takeovers-with-ghostbuster), I learned about the concept of "dangling EC2 IP addresses".

To our surprise, we found that the EC2 IP address had been assigned to a new instance, which was now hosting a fresh application. This new instance, however, was out of scope for our testing, having been assigned to another AWS customer. It dawned on us that we had stumbled upon a domain takeover opportunity, albeit inadvertently: customers release EC2 IP addresses, and those addresses are later assigned to others.

There were many wins here, and it goes to show how important it is to keep IT infrastructure up to date. In Active Focus, we have AWS enumerators that can help identify dangling EC2 IP addresses, a capability we have now added to the development roadmap.
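As a rough sketch of the idea behind such an enumerator (a simplification, not the Active Focus implementation), one might compare a domain's resolved addresses against the Elastic IPs still allocated to the account. The domain and region below are hypothetical:

```python
# Sketch of a dangling-EC2-IP check; domain, region and account context are hypothetical.
import socket
import boto3

domain = "app.example.com"
resolved_ips = {info[4][0] for info in socket.getaddrinfo(domain, None)}

# Elastic IPs still owned by the account (requires credentials for that account).
ec2 = boto3.client("ec2", region_name="eu-west-1")
owned_ips = {addr["PublicIp"] for addr in ec2.describe_addresses()["Addresses"]}

for ip in resolved_ips:
    if ip not in owned_ips:
        print(f"{domain} -> {ip} is not an Elastic IP in this account; "
              "if the record points at AWS EC2 space, it may be dangling.")
```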

Responsible Disclosure: An Ethical Obligation

With this revelation came a sense of responsibility. We immediately initiated a responsible disclosure process to not only inform our client about the potential subdomain takeover but also to alert the third-party service provider about the vulnerabilities and risks associated with their deployment.

In the world of cybersecurity, the stakes are always high, and the implications of vulnerabilities can have far-reaching consequences. While our customer was fortunate to have us uncover these issues, the unintended discovery of a third-party vulnerability was a sobering reminder of the interconnected nature of cloud services.

In the space of continuous penetration testing, the need to validate and verify target scope and ownership becomes increasingly important, not only to avoid wasting time on the wrong assets, but also to prevent legal issues for organisations operating in countries that have not yet advanced legislation surrounding ethical, responsible disclosure.

Conclusion: Scoping and a Lesson Learned

In the end, our journey through this Django application was a testament to the importance of secure configurations, vigilance, and the ethical responsibilities that come with penetration testing. While our discovery could have led to significant ramifications had it fallen into the wrong hands, it also reinforced the value of agile and rapid penetration testing methodologies.

At least we could joke that the third-party vendor received a complimentary pentest service, albeit at the cost of their operational security. As we wrapped up our findings, it was clear: in cybersecurity, every misstep can lead to an unexpected opportunity for learning and improvement. And as penetration testers, it’s our job to navigate these landscapes with precision, responsibility, and a keen eye for the narrative that unfolds with every click.