Beginner's Guide About Penetration And Bug Bounty 2020

Hello everyone, let's continue to the third vulnerability of the OWASP Top 10: Sensitive Data Exposure (SDE). So what is SDE? It occurs when an application or API does not properly protect sensitive data, for example user account details, credit card details, or passwords.


An SDE vulnerability also arises if an attacker can view this sensitive data and use it to get access to the internal network. GitHub tokens and API keys that have been leaked onto the internet can also lead to sensitive data exposure. A very good example of this is a Snapchat report that was submitted by a security researcher on HackerOne.


He was able to find a token that had been committed to GitHub and indexed by Google. He reported that token and received a fifteen-thousand-dollar bounty, basically a reward for that specific bug, so SDE can be very critical and dangerous. Another example is sensitive invoices indexed by Google: there was a vulnerability in PayPal in which users' invoices were indexed by the Google crawlers, and by running the right Google dork, attackers and security researchers were able to identify those invoices.

This was also reported to PayPal and fixed. Insecure storage of data also leads to sensitive data exposure, which means a website owner is keeping important, critical files on the server, for example db.conf, which contains database credentials, or loginpasswords.txt, which contains login passwords. Any user can try to fetch those files using a directory brute force, as in the sketch below.
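Here is a minimal sketch of that kind of directory brute force, assuming a hypothetical target https://example.com and a tiny wordlist of commonly exposed file names; real tools such as dirb or gobuster do the same thing at scale, and you should only run this against targets you are authorized to test.

    import requests  # third-party HTTP library

    TARGET = "https://example.com"  # hypothetical target, for illustration only
    WORDLIST = ["db.conf", "loginpasswords.txt", "backup.zip", ".env"]

    for name in WORDLIST:
        url = f"{TARGET}/{name}"
        try:
            resp = requests.get(url, timeout=5)
        except requests.RequestException:
            continue
        # A 200 response for a file like db.conf suggests sensitive data exposure
        if resp.status_code == 200:
            print(f"[+] Possibly exposed file: {url}")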


So why does SDE happen? First, there is no proper access control: whenever sensitive data is exchanged, proper access controls should be in place. Second, APIs are not properly protected: unauthenticated APIs that anyone can call for any user also lead to SDE. Third, there is no robots.txt file, which is very important, as in the PayPal example where the Google crawler was able to crawl the invoices. The robots.txt file is a file in which we write a set of rules telling crawlers what can and cannot be crawled on the website, as in the sketch below.
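As a minimal example, a robots.txt served at the site root could look like this, assuming the sensitive pages live under hypothetical /invoices/ and /admin/ paths:

    User-agent: *
    Disallow: /invoices/
    Disallow: /admin/
    Allow: /

Keep in mind that robots.txt is only advisory: well-behaved crawlers respect it, but it is not an access control, so sensitive pages still need proper authentication.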


A lack of stripping sensitive data when saving it at the server is another cause of SDE. So what can be achieved through sensitive data exposure?


Sensitive information of the user, which we all know can include transaction details, passwords, and so on. API keys can also be obtained through sensitive data exposure: developers often make the mistake of leaving these keys and tokens in their GitHub repositories. Attackers can also get the tokens of internal portals that are only accessible to domain admins, along with sensitive information like passwords and SSH keys; in short, anything sensitive that an attacker can get hold of falls under the category of sensitive data exposure and can be misused further.
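As a rough illustration of how leaked keys are found, here is a minimal sketch that scans a local repository checkout for strings matching a few common secret formats; the repository path and patterns are assumptions for the example, and real scanners such as truffleHog or gitleaks are far more thorough.

    import os
    import re

    REPO_PATH = "./my-repo"  # hypothetical local checkout to scan

    # A few illustrative patterns for common secret formats (not exhaustive)
    PATTERNS = {
        "AWS access key": re.compile(r"AKIA[0-9A-Z]{16}"),
        "GitHub token": re.compile(r"ghp_[A-Za-z0-9]{36}"),
        "private key header": re.compile(r"-----BEGIN (RSA|OPENSSH) PRIVATE KEY-----"),
    }

    for root, _, files in os.walk(REPO_PATH):
        for name in files:
            path = os.path.join(root, name)
            try:
                text = open(path, errors="ignore").read()
            except OSError:
                continue
            for label, pattern in PATTERNS.items():
                if pattern.search(text):
                    print(f"[!] Possible {label} in {path}")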


So how do we fix sensitive data exposure issues? The first step is: don't store sensitive data unnecessarily. Next, make sure to encrypt all sensitive data at rest; if you have any sensitive data on your server, encrypt everything with strong encryption and a private key.
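As one way to encrypt data at rest, here is a minimal sketch using the symmetric Fernet scheme from the third-party cryptography package; the file name and contents are placeholders, and in practice the key must be stored separately from the data, for example in a secrets manager.

    from cryptography.fernet import Fernet  # third-party "cryptography" package

    # Generate a private key once and keep it outside the data directory
    key = Fernet.generate_key()
    fernet = Fernet(key)

    # Encrypt the sensitive contents before writing them to disk
    plaintext = b"db_user=admin;db_pass=S3cret!"
    ciphertext = fernet.encrypt(plaintext)
    with open("db.conf.enc", "wb") as fh:
        fh.write(ciphertext)

    # Decrypt only when the application actually needs the value
    assert fernet.decrypt(ciphertext) == plaintext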


Also disable caching of responses that contain sensitive data, so that browsers and proxies do not keep copies of those responses. Do not put any keys or tokens in GitHub or anywhere on the server, and implement proper access control for users.
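Disabling caching is done with response headers. A minimal sketch with Flask (just one illustrative framework, with made-up field values) could look like this:

    from flask import Flask, jsonify

    app = Flask(__name__)

    @app.route("/account")
    def account():
        # The response contains sensitive data, so tell browsers and proxies not to store it
        resp = jsonify({"card_last4": "4242", "balance": 1337})
        resp.headers["Cache-Control"] = "no-store, no-cache, must-revalidate"
        resp.headers["Pragma"] = "no-cache"
        return resp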


Conclusion


One such access-control feature is known as maker-checker, a functionality generally used in banks: the maker is the person with low privileges and the checker is the person with high privileges. If the maker is able to bypass this access control and obtain the privileges of the checker, meaning the higher privileges, that bypass can also lead to sensitive data exposure issues.
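As a minimal, hypothetical illustration of the maker-checker idea, the role check has to be enforced on the server for every sensitive action rather than trusted from the client:

    from enum import Enum

    class Role(Enum):
        MAKER = 1    # low privileges: can create and submit transactions
        CHECKER = 2  # high privileges: can approve them

    def approve_transaction(user_role: Role, txn_id: int) -> str:
        # Enforce the check server-side; never rely on a hidden form field or client flag
        if user_role is not Role.CHECKER:
            raise PermissionError("only a checker may approve transactions")
        return f"transaction {txn_id} approved"

    # A maker attempting approval is rejected; a checker succeeds
    try:
        approve_transaction(Role.MAKER, 42)
    except PermissionError as exc:
        print("denied:", exc)
    print(approve_transaction(Role.CHECKER, 42))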
