KQL is a cornerstone of Microsoft triage and threat hunting. Look at the links below to get an idea of how KQL works, along with tutorials to get your hands dirty.
To figure out which tables an organization has in a Log Analytics workspace, and therefore what we can search, we can navigate to Microsoft Defender -> Advanced Hunting, where we find a schema identical to the Log Analytics workspace. Behind the scenes, Defender connects to Log Analytics and surfaces all the non-empty tables.
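The same discovery can be done with KQL directly in the workspace. A minimal sketch that lists every non-empty table (note that `union *` scans all tables and can be expensive, so scope the time range):

```kql
// List tables that received data in the last 24 hours, with row counts
union withsource=TableName *
| where TimeGenerated > ago(24h)
| summarize RowCount = count() by TableName
| sort by RowCount desc
```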
Sentinel works in tandem with a Log Analytics workspace. A Log Analytics workspace can exist as its own entity, but a Sentinel instance must be attached to a Log Analytics workspace before Sentinel can be launched.
There are two ways of configuring a Log Analytics workspace to collect data: from Azure machines, or from Azure Arc agents (for servers that sit outside the Azure environment and are onboarded into Azure through Arc).
Sentinel and Log Analytics have a one-to-one relationship. We cannot attach more than one Log Analytics workspace to a Sentinel instance, and vice versa.
Sentinel pricing is based on the amount of data it analyzes. On top of that, cross-regional data transfer charges apply if instances or resources sit in different regions, which includes cross-regional data processing and analysis.
Sentinel is Microsoft's cloud-native SIEM tool.
If we have multiple Sentinel workspaces, we can bring all the data into a single pane of glass. We can leverage Azure Lighthouse to connect to other environments, and Workspace Manager to replicate alert rules across workspaces.
The permissions for Sentinel are a bit different from other services. We set these permissions at the resource-group level or the Log Analytics workspace level; we cannot do this at the subscription level. If the workspace doesn't have a Sentinel instance, we cannot set the permissions at the workspace level.
We use data connectors to ingest data from different types of sources like Palo Alto, Splunk, etc. We can also use data connectors to ingest data from other Azure services like Entra ID. We can build our own connectors if needed, and there are AWS connectors to ingest logs from AWS.
When enabling a connector, we can specify which tables it should import: everything the connector offers, or only specific data. Even with some logs enabled, we have to verify that the connector is actually sending data to Sentinel and its workspace.
We can use different connectors to ingest data into Sentinel; the connectors page shows all the configuration that needs to happen in order to ingest logs. It also shows which data is flowing and which tables are not receiving any.
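Beyond the connectors page, a quick KQL freshness check can confirm which tables have gone quiet. A sketch:

```kql
// Show the last record received per table; anything older than a day
// may indicate a connector that has stopped flowing
union withsource=TableName *
| summarize LastRecord = max(TimeGenerated) by TableName
| where LastRecord < ago(1d)
```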
** The Threat Intelligence feed is free in Sentinel ** – the TAXII connector connects a feed to the workspace, and it is the only connector that lets us set the frequency of data collection.
All these data connectors can be installed from the Content Hub. There are vendor-specific connectors, but there are also generic connectors like Logstash.
In order to successfully connect your connectors, we need to fulfill some prerequisites along with the configuration required for the connector to work. The connector page shows what needs to be done for it to work.
Office 365, Defender for Cloud, Entra ID Protection, Azure Activity, Azure DDoS Protection, Azure Firewall, Defender XDR, etc. are some of the natively available logs within Azure to ingest into Sentinel.
Sentinel works mainly through connectors and the data we are ingesting. There are connectors for most applications you can find in the marketplace.
For on-prem: the best practice is to install the Azure Arc agent. Once Arc is installed, the machine is managed like an Azure machine, and we can deploy the AMA agent and collect the data directly using a Data Collection Rule (DCR) in the AMA connector. This DCR can be configured directly from the connector page in Sentinel, and scoped at the subscription level or, more granularly, at the resource-group level. We can also specify which types of logs we want from AMA.
For Windows: Windows Security Events via AMA. For Linux: we also have a Syslog collector.
These connectors expose the same options that show up in the legacy Log Analytics agent when configuring which logs to collect within the Log Analytics workspace.
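Once the Windows connector is flowing, the events land in the standard `SecurityEvent` table. A sketch of a common starting query (the event ID filter is illustrative):

```kql
// Failed Windows logons (event ID 4625) collected via AMA
SecurityEvent
| where TimeGenerated > ago(1h)
| where EventID == 4625
| summarize Failures = count() by Account, Computer
| sort by Failures desc
```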
Data Collection Rules and AMA are going to be the central pieces going forward. For non-Azure machines, Arc should be installed; once it is, the machine effectively becomes an Azure machine, and we can configure anything we want just like an Azure machine.
Your logs can show up in a Log Analytics workspace without showing up in Sentinel. The LAW can still act as its own entity: if systems are configured to send their logs directly to the LAW, those logs might not surface in Sentinel. We can run queries on them ourselves, but Sentinel doesn't do this for us automatically.
This connector can be installed on a Linux machine; we then configure the end application to send its logs to Sentinel using the Syslog connector via CEF. Most third-party tools use CEF. If there is no pre-built connector within Azure, we can install the CEF collector on a Linux machine and use it as a log forwarder.
It is recommended to install the CEF forwarder on-prem, since syslog uses port 514 by default; from on-prem, the forwarder sends to Azure over 443, which is open in most cases. But if the CEF server sits in the cloud, we would have to open port 514 on the firewall for this to work.
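CEF events forwarded this way land in the standard `CommonSecurityLog` table, so a quick query confirms which appliances are actually forwarding. A sketch:

```kql
// Which devices are sending CEF data, and how much
CommonSecurityLog
| where TimeGenerated > ago(24h)
| summarize Events = count() by DeviceVendor, DeviceProduct
| sort by Events desc
```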
We can use parsers in Sentinel to normalize the syslog data we are getting. Some connectors include parsers (but some don't), and with most newer connectors the data arrives already parsed. This is how we normalize the data we receive so that Sentinel and its users can understand it.
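A minimal custom-parser sketch using the KQL `parse` operator. The process name and the key=value layout below are hypothetical; adjust them to whatever your appliance actually emits:

```kql
// Extract structured fields from a raw syslog message
Syslog
| where ProcessName == "my-appliance"   // hypothetical source process
| parse SyslogMessage with * "user=" User " src=" SourceIP " action=" Action
| project TimeGenerated, Computer, User, SourceIP, Action
```

In practice a parser like this is saved as a workspace function, so analysts query the function name instead of repeating the `parse` logic against the raw table.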
So far, we have configured and sent data from different sources into Sentinel, but it still isn't analyzing that data.
When we install a connector, we usually get analytics rules included with it. These analytics rules help us analyze the data we are receiving within Sentinel. There are different types of analytics rules: the first is a scheduled query and the other is an NRT (Near Real Time) query. These rules analyze the data and, when they find something, create an incident or do whatever else we ask of them.
For a scheduled query, the minimum frequency we can set is 5 minutes, and it has to run at least once every 14 days. NRT, as the name suggests, runs in near real time, so we cannot set a frequency for NRT rules.
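A sketch of the kind of KQL that backs a scheduled analytics rule, here flagging accounts with many failed Entra ID sign-ins. The lookback window and threshold are illustrative, not recommended values:

```kql
// Possible brute force: many failed sign-ins per account in the last hour
SigninLogs
| where TimeGenerated > ago(1h)
| where ResultType != "0"               // non-zero ResultType = failed sign-in
| summarize FailedAttempts = count(), DistinctIPs = dcount(IPAddress)
    by UserPrincipalName
| where FailedAttempts > 20
```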
Within Sentinel, we have the MITRE ATT&CK pane, which shows all the different attack vectors that are possible and whether we have something in Sentinel to protect against each attack vector in the MITRE framework. This pane shows us the gaps in our security posture.
If we are missing something, we can navigate to the actual MITRE page and figure out a KQL query to cover that gap.
As with parsers, connectors often include pre-built automation rules or playbooks, but we also have templates to build custom automation rules and playbooks.
Playbooks are just the official Sentinel term; under the hood, they are Logic Apps.