Replies: 3 comments
-
https://www.fingerbank.org/cloudapi/ could also be interesting
-
Some interesting tools for reconnaissance and exploitation on a network:
Getting insights on whether your systems are up to date:
Overview of other nice red-teaming tools: https://github.com/infosecn1nja/Red-Teaming-Toolkit
-
many more tools:
-
I spent some time going through the open issue list in the closed-source Boefjes repository. Many issues did not contain meaningful details beyond the title. We should probably re-evaluate the status of the issues/ideas below.
Suggested for implementation
Actually in-progress
Internet.nl
Ideas
Cookie parsing (https://cookiedatabase.org/)
We should start creating cookie objects from HttpHeaders. In this step we should be careful to normalize values and expiration dates to repeatable values instead of the actual times/values seen in the HttpHeaders. From there we can use https://cookiedatabase.org/ to add information about various cookies via the knowledge base.
A simple bit could be created that looks for HttpHeader OOIs and finds cookie headers in them, allowing us to create cookie OOIs.
On these cookie OOIs, the following optional attributes are possible:
Set-Cookie: &lt;cookie-name&gt;=&lt;cookie-value&gt;
Set-Cookie: &lt;cookie-name&gt;=&lt;cookie-value&gt;; Expires=&lt;date&gt;
Set-Cookie: &lt;cookie-name&gt;=&lt;cookie-value&gt;; Max-Age=&lt;number&gt;
Set-Cookie: &lt;cookie-name&gt;=&lt;cookie-value&gt;; Domain=&lt;domain-value&gt;
Set-Cookie: &lt;cookie-name&gt;=&lt;cookie-value&gt;; Path=&lt;path-value&gt;
Set-Cookie: &lt;cookie-name&gt;=&lt;cookie-value&gt;; Secure
Set-Cookie: &lt;cookie-name&gt;=&lt;cookie-value&gt;; HttpOnly
Set-Cookie: &lt;cookie-name&gt;=&lt;cookie-value&gt;; SameSite=Strict
Set-Cookie: &lt;cookie-name&gt;=&lt;cookie-value&gt;; SameSite=Lax
Set-Cookie: &lt;cookie-name&gt;=&lt;cookie-value&gt;; SameSite=None; Secure
https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Cookie
Other attributes should raise a finding as being invalid.
In a second bit we might want to evaluate various of these flags as unwanted; e.g., cookies that are not Secure, HttpOnly, and SameSite could be flagged.
Also, once cookies are properly modeled, we can create bits that flag various tracking cookies, or even detect which software is in use.
Cookies that are allowed to persist for a very long time might also be flagged.
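The parsing step could look roughly like the sketch below. This is a hedged illustration, not the actual OpenKAT model: the function name and the dict keys are my own, and the normalization (dropping the value, bucketing expiry into session vs. persistent) is one possible way to get repeatable OOIs.

```python
from http.cookies import SimpleCookie


def parse_set_cookie(header_value):
    """Turn one Set-Cookie header value into normalized cookie dicts.

    Actual values and expiry timestamps are deliberately reduced to
    repeatable buckets, so re-scans produce identical objects.
    (Illustrative sketch; field names are not the real OOI schema.)
    """
    jar = SimpleCookie()
    jar.load(header_value)
    cookies = []
    for name, morsel in jar.items():
        cookies.append({
            "name": name,
            # the actual value changes per session, so only record presence
            "has_value": bool(morsel.value),
            "secure": bool(morsel["secure"]),
            "httponly": bool(morsel["httponly"]),
            "samesite": morsel["samesite"] or None,
            "domain": morsel["domain"] or None,
            "path": morsel["path"] or None,
            # normalize expiry: session cookie vs. persistent cookie
            "persistent": bool(morsel["expires"] or morsel["max-age"]),
        })
    return cookies
```

A second bit could then run purely over these dicts, e.g. flagging any cookie where `secure`, `httponly`, or `samesite` is missing.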
Create finding on any A/AAAA/CNAME dns record that points to a private IP-range on a public domain
(When a public domain contains any record pointing to a private IP, a local network can offer up a web server on that IP for the affected domain, and by doing so gain access to the browser's security domain for that domain. E.g.: bank.com is secure, and internal.bank.com resolves to 10.0.0.5. A victim with bank.com cookies connects to a malicious (Wi-Fi) network. The malicious network directs the victim to visit internal.bank.com, which is now routable to a server hosted by the malicious network. The malicious network's JavaScript code, served in bank.com's security domain, now has access to cookies (etc.) for bank.com in the victim's browser. We should create a finding for any record that makes this possible. Once the DNS records have been collected, this can be run as a bit on any record matching the criteria.)
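The core check of such a bit is small; a sketch, assuming the record value arrives as a plain string (the function name is my own):

```python
import ipaddress


def points_to_private(record_value):
    """True when an A/AAAA record's address is not globally routable.

    Note: Python's is_private also covers link-local, loopback and the
    documentation/TEST-NET ranges, which is arguably what we want here.
    """
    try:
        addr = ipaddress.ip_address(record_value)
    except ValueError:
        return False  # not an IP literal (e.g. a CNAME target: resolve first)
    return addr.is_private
```

CNAME records would need one extra resolution step before this check, since their value is a hostname rather than an address.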
Resource hash to software details (https://circl.lu/services/hashlookup/)
Find the md5sum for a file (normally we would already have the contents in the resource's proof, or we might already have the hash stored):
sum=$(wget -O- https://cdnjs.cloudflare.com/ajax/libs/jquery/3.6.0/jquery.min.js | md5sum | awk '{print $1}')
Match the hash against the api:
curl -X 'GET' "https://hashlookup.circl.lu/lookup/md5/$sum" -H 'accept: application/json'
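Since the resource contents are usually already in the proof, both shell steps above can be done in-process. A sketch in Python, using the documented hashlookup endpoint (the function names are my own):

```python
import hashlib
import json
from urllib.error import HTTPError
from urllib.request import urlopen


def resource_md5(content):
    """MD5 hex digest of a resource body we already hold as proof."""
    return hashlib.md5(content).hexdigest()


def hashlookup(md5):
    """Query CIRCL hashlookup; returns the JSON record, or None if unknown."""
    url = f"https://hashlookup.circl.lu/lookup/md5/{md5}"
    try:
        with urlopen(url) as resp:
            return json.load(resp)
    except HTTPError as exc:
        if exc.code == 404:  # hash not in the database
            return None
        raise
```

A hit returns package metadata (filename, source, etc.) that could be attached to the resource OOI as software details.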
Create deny-list for world-writable domains in CSP headers
Some domains allow anyone to create an account and publish code or files. It would be unwise to add these domains to your CSP headers, since they would allow attackers to simply create an account and start publishing malicious files that would pass the CSP checks. Once parsed, a bit could hold the CSP headers against this deny-list and flag those domains that are not secure.
Examples are GitHub gists, static pages, package indexes that allow third parties to upload packages, pastebin-like domains, and mail servers where attachments are hosted (e.g., googlecontent.com).
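Once CSP headers are parsed, the deny-list match itself is straightforward. A sketch with an illustrative (not official) deny-list and my own function name; real CSP parsing has more edge cases (scheme sources, wildcards, keywords) than this shows:

```python
# Illustrative deny-list of world-writable hosts; a real one would be curated.
WORLD_WRITABLE = {
    "gist.githubusercontent.com",
    "raw.githubusercontent.com",
    "pastebin.com",
}


def flag_csp_sources(csp):
    """Return CSP source hosts matching the deny-list (incl. subdomains)."""
    flagged = set()
    for directive in csp.split(";"):
        parts = directive.split()
        for src in parts[1:]:  # parts[0] is the directive name
            # crude host extraction: strip scheme, path and port
            host = src.split("//")[-1].split("/")[0].split(":")[0].lower()
            if any(host == d or host.endswith("." + d) for d in WORLD_WRITABLE):
                flagged.add(host)
    return flagged
```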
Warn about similar sites not redirecting to one domain
Dutch government policy states that the smallest possible number of hostnames should be running a website. This means that, next to checking the similarity of IPv4/IPv6 responses on the same hostname, similarity across non-redirecting domains (e.g., www vs. non-www, or typo domains) should raise an issue: there should be just one active hostname, and the others should redirect to it.
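Assuming the hostname variants and their first redirect targets are already collected as OOIs, the check itself reduces to comparing redirect targets against the canonical host. A sketch (function name and input shape are my own):

```python
from urllib.parse import urlparse


def non_canonical_hosts(redirects, canonical):
    """Find hostname variants that serve content instead of redirecting.

    redirects maps hostname -> Location header target (None means the host
    serves a site itself). Returns the hosts that should raise a finding.
    """
    offenders = []
    for host, location in redirects.items():
        if host == canonical:
            continue  # the one hostname allowed to serve the site
        target = urlparse(location).hostname if location else None
        if target != canonical:
            offenders.append(host)
    return offenders
```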
Marked for investigation