Azure Storage File Explorer

In the first post of this series on the Azure Files preview, I discussed what the feature entailed, and what you can do with it, as well as how to sign up for the preview. In the second post, I showed how to get the PowerShell cmdlets for the preview, and how to use them to manage your share and transfer files to/from your share. In this post, I’ll show you how to create a VM in Azure, show you two ways to map the share, and how to copy files to it.

How to create a VM in Azure

I realize many who read this post will already know this; I’m including it for those who are new to Microsoft Azure and want to try out the Azure Files feature.

First, log into your Azure account. Select Virtual Machines from the menu on the left.

Next, at the bottom, select + New, Compute, Virtual Machine, From Gallery.

On the next screen, select Windows Server 2012 R2 Datacenter and click the right arrow in the bottom right hand side of the screen.

On the next screen, fill in a name for the VM. Then specify a username and password. These credentials will be used when you RDP into your VM, so don’t forget them! When you’re finished, click on the arrow on the bottom right-hand side of the screen.

On the next screen, specify “Create a new cloud service” and provide a DNS name for the cloud service. This is kind of a wrapper for the VM. (You can deploy multiple VM’s into the same cloud service, and they will be load balanced under the same IP address.) In my case, I already have a storage account set up. If you don’t, or want to use a different one, you can ask it to generate one for you (it’s an option in the Storage Account dropdown list). For optimal performance, pick a region that is close to your location. When you’re done, click the arrow at the bottom right-hand side of the screen.

The next screen is used to select the VM Extensions to be installed. Leave the checkbox checked for “Install the VM agent”; for the purpose of this exercise, you don’t need any of the other extensions, so just leave them unchecked and click the checkmark in the circle at the bottom right-hand side of the screen to continue.

Now Azure will provision and start up your VM. At this point, you just wait until it says “Running” (like the first one in my list displayed below), and then you’re ready to continue. This seems to take a long time, but you’ll find the wait much more enjoyable if you take a quick run to the closest Starbucks. (I’ll be right back…)

Now click on your VM and it will bring up the Dashboard. (It might bring up the QuickStart screen; if it does, just continue to the Dashboard.)

At the bottom of the screen, you will see the Connect icon is lit up and waiting for you.

Access your Share from inside your VM

Click Connect at the bottom of the portal screen to RDP into your VM. When prompted, specify the username and password that you provided when creating the VM. Click through the security prompts. If it’s the first time you’ve logged into your VM, click Yes when prompted with this screen:

Now let’s attach our share. There are a couple of ways to do this. One is to use the NET USE command.

Open a command window. The easiest way to do this is to click the Windows start button and on the Modern interface, just start typing “command”. This will bring up the search box on the right, and you should see what you’re looking for; click on it to open the command window.

Here is the command to use to connect your share:

C:\> net use [drive letter]: \\[storage account name].file.core.windows.net\[share name] /u:[storage account name] [storage account key]

I’m going to mount the share I created in my previous post. It was called ‘nightbird’, and was on ‘nightbirdstorage3’. I’ve blurred out my storage account key in the following example:
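With my storage account and share names filled in, the command looks roughly like this (z: is just the drive letter I picked, and the key is a placeholder):

C:\> net use z: \\nightbirdstorage3.file.core.windows.net\nightbird /u:nightbirdstorage3 [storage account key]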

Now open Windows Explorer, and you’ll see the share listed.

Now you can double-click on it and see what’s on it. On mine, I can see the images I uploaded and the folder I created in the previous post.

If you double-click on the folder Images, you can see the files in that folder. At this point, this is just like using any network share you’ve ever used on-premises.

Any changes you make from within the VM will of course appear if you go back and use the PowerShell commands to list the files on the share, whether you add, change, or delete files and/or directories.

At this point, if you go back to the command window, you can use the NET USE command to see what shares you have attached.

Another way to access the share

Instead of using the NET USE command, you can actually map the network drive from within Windows Explorer.

Bring up Windows Explorer, right-click on “This PC” and select “Map network drive:”.

Type in what would be the network UNC path to your share, which will be in this format:

\\[storage account name].file.core.windows.net\[share name]
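For the share I created in the previous post, that path would be:

\\nightbirdstorage3.file.core.windows.net\nightbird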

Be sure that “Reconnect at sign-in” is checked. Click Finish to complete the drive mapping.

You will be prompted for username and password. This is for the share, so the username is the storage account name (nightbirdstorage3 in my case) and the password is the account key.

After doing this, it will open Windows Explorer, and show the share and the files and directories in the root.

So that’s the second way to map your network drive.

How do I put files on my share?

This is pretty simple; you can use the RDP session to do that. Just bring up Windows Explorer on your local computer. Select the directory and/or files that you want to copy and click Ctrl-C.

Switch to the RDP session. Using Windows Explorer, select where you want the files to go, then click Ctrl-V.

You can also copy the files on the share and paste them into Windows Explorer on the desktop to download them.

Another way to do this is when you click Connect in the portal, it will prompt you to save or open the RDP file. If you save it, you can then go find the RDP file, right-click on it and choose Edit. In the Local Resources tab, you can select More… under Local Devices, and then open up Drives and select a local drive. This will map the drive when you log into the VM, and you can access it as if it were local in the VM. I’m going to attach my D drive and then click OK, select the General tab on the main RDP window, and select Connect to connect to the VM.

Now after I log into my VM and bring up Windows Explorer, I can access that drive from inside the VM:

Now I can copy the files directly from my local computer to the file share accessed by my VM (and vice versa).

Regions and subscriptions and access, oh my!

An important thing to note is that your file share does not need to be in the same account as the VMs you are going to attach it to; it just needs to be in the same region.

I have multiple Microsoft accounts that have an Azure subscription. If I create a storage account in US West in one of my Azure accounts, and set up a file share, I can access that file share from any VM in any of my other Azure subscriptions that have VM’s in US West. This could bring up some interesting use cases.

Something to try

When you attach a file share to multiple VM’s, and one of the VM’s changes one of the files, a notification is sent to the other VMs that the file has changed and their view of the file share is updated. To see this work in action, you can follow these steps:

1. Create another VM in the same region as your file share.

2. RDP into the VM and attach the share.

3. RDP into the first VM and bring up the share folder.

4. Change one of the files on the share in one of the VMs.

5. Change over to the other RDP session and look at the files on the share, and you will see the update there as well.

Making the file share sticky

When I attach the share using NET USE, log out, restart the VM, and come back in, I have to provide credentials again to use the file share.

If I map the network drive using Windows Explorer, and check the box that says “Reconnect at sign-in”, then when I log out, restart the VM, and come back in, the share is still mapped and available without providing credentials.

This is tied to the user account used to map the drive. So be aware that you might need to map the drive under other user accounts. For more information, check out this article by the Microsoft Azure Storage team that addresses the stickiness of Azure File share mappings.

If you want to access that share from a web site, you would create a virtual directory on the VM hosting the web site. For example, if I wanted \\nightbirdstorage3.file.core.windows.net\nightbird\Images to be accessible as http://contoso.com/images, I would create a virtual directory that points to that folder on the share using its UNC path. When I do that, it requests the credentials for accessing the share, and I can provide the storage account information at that time; it will stay sticky if the role is rebooted. (Thanks to Steve Evans for the tip.)
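If you'd rather script that than click through IIS Manager, a rough sketch with appcmd might look like the following. I'm assuming here that the IIS site is named contoso.com, and the UNC credentials still need to be supplied (either when prompted or in the virtual directory settings):

C:\> %windir%\system32\inetsrv\appcmd.exe add vdir /app.name:"contoso.com/" /path:/images /physicalPath:\\nightbirdstorage3.file.core.windows.net\nightbird\Images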

Summary

In this post, I showed you how to create a VM in Azure and attach your share using two different methods. I also showed how to copy files to the share from your local computer. In my next post, I’ll show you how to programmatically create a share, create directories, and upload/download files using the Storage Client Library.

Tags: Azure, AzureFiles, Microsoft

By default, logs ingested into Azure Sentinel are stored in Azure Monitor Log Analytics. This article explains how to reduce retention costs in Azure Sentinel by sending them to Azure Data Explorer (ADX) for long-term retention.

Storing logs in ADX reduces costs while retaining your ability to query your data, and is especially useful as your data grows. For example, while security data may lose value over time, you may be required to retain logs for regulatory requirements or to run periodic investigations on older data.

About Azure Data Explorer

ADX is a big data analytics platform that is highly optimized for log and data analytics. Since ADX uses Kusto Query Language (KQL) as its query language, it's a good alternative for Azure Sentinel data storage. Using ADX for your data storage enables you to run cross-platform queries and visualize data across both ADX and Azure Sentinel.

For more information, see the ADX documentation and blog.

When to integrate with ADX

Azure Sentinel provides full SIEM and SOAR capabilities, quick deployment and configuration, as well as advanced, built-in security features for SOC teams. However, the value of storing security data in Azure Sentinel may drop after a few months, once SOC users don't need to access it as often as they access newer data.

If you only need to access specific tables occasionally, such as for periodic investigations or audits, you may consider that retaining your data in Azure Sentinel is no longer cost-effective. At this point, we recommend storing data in ADX, which costs less, but still enables you to explore using the same KQL queries that you run in Azure Sentinel.

You can access the data in ADX directly from Azure Sentinel using the Log Analytics ADX proxy feature. To do so, use cross-cluster queries in your log search or workbooks.
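For example, a cross-cluster query run from the Azure Sentinel Logs blade might look like the following; the cluster, database, and table names are placeholders for whatever you created in ADX:

adx('https://<adx-cluster>.<region>.kusto.windows.net/<database>').SecurityEvent
| where TimeGenerated > ago(180d)
| summarize Count = count() by Computer
| top 10 by Count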

Important

Core SIEM capabilities, including Analytic rules, UEBA, and the investigation graph, do not support data stored in ADX.

Note

Integrating with ADX can also enable you to have control and granularity in your data. For more information, see Design considerations.

Send data directly to Azure Sentinel and ADX in parallel

You may want to retain any data with security value in Azure Sentinel to use in detections, incident investigations, threat hunting, UEBA, and so on. Keeping this data in Azure Sentinel mainly benefits Security Operations Center (SOC) users, where typically, 3-12 months of storage are enough.

You can also configure all of your data, regardless of its security value, to be sent to ADX at the same time, where you can store it for longer. While sending data to both Azure Sentinel and ADX at the same time results in some duplication, the cost savings can be significant as you reduce the retention costs in Azure Sentinel.

Tip

This option also enables you to correlate data spread across data stores, such as to enrich the security data stored in Azure Sentinel with operational or long-term data stored in ADX. For more information, see Cross-resource query Azure Data Explorer by using Azure Monitor.

The following image shows how you can retain all of your data in ADX, while sending only your security data to Azure Sentinel for daily use.

For more information about implementing this architecture option, see Azure Data Explorer monitoring.

Export data from Log Analytics into ADX

Instead of sending your data directly to ADX, you can choose to export your data from Log Analytics into ADX via an Azure Event Hub, or via Azure Storage and Azure Data Factory.

Data export architecture

The following image shows a sample flow of exported data through the Azure Monitor ingestion pipeline. Your data is directed to Log Analytics by default, but you can also configure it to export to an Azure Storage Account or Event Hub.

When configuring the data export rules, select the types of logs you want to export. Once configured, new data arriving at the Log Analytics ingestion endpoint, and targeted to your workspace for the selected tables, is exported to your Storage Account or Event hub.
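As a rough sketch, a data export rule can be created with the Azure CLI along the following lines; the resource names are placeholders, and the exact parameter names may vary across CLI versions:

az monitor log-analytics workspace data-export create \
    --resource-group MyResourceGroup \
    --workspace-name MySentinelWorkspace \
    --name ExportToEventHub \
    --tables SecurityEvent Syslog \
    --destination /subscriptions/<subscription-id>/resourceGroups/MyResourceGroup/providers/Microsoft.EventHub/namespaces/MyEventHubNamespace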

When configuring data for export, note the following considerations:

Scope of data exported: Once export is configured for a specific table, all data sent to that table is exported, with no exceptions. Exporting a filtered subset of your data, or limiting the export to specific events, is not supported.

Location requirements: Both the Azure Monitor / Azure Sentinel workspace and the destination location (an Azure Storage account or Event Hub) must be in the same geographical region.

Supported tables: Not all tables are supported for export; for example, custom log tables are not supported.
For more information, see Log Analytics workspace data export in Azure Monitor and the list of supported tables.

Data export methods and procedures

Use one of the following procedures to export data from Azure Sentinel into ADX:

  • Via an Azure Event Hub. Export data from Log Analytics into an Event Hub, where you can ingest it into ADX. This method stores some data (the first X months) in both Azure Sentinel and ADX.

  • Via Azure Storage and Azure Data Factory. Export your data from Log Analytics into Azure Blob Storage, and then use Azure Data Factory to run a periodic copy job that further exports the data into ADX. This method lets you copy data only when it nears its retention limit in Azure Sentinel / Log Analytics, avoiding duplication.

This section describes the first method: exporting Azure Sentinel data from Log Analytics into an Event Hub, from where you can ingest it into ADX. As with sending data directly to Azure Sentinel and ADX in parallel, this method involves some data duplication, because the data is streamed into ADX as it arrives in Log Analytics.

The following image shows a sample flow of exported data into an Event Hub, from where it's ingested into ADX.

The architecture shown in the previous image provides the full Azure Sentinel SIEM experience, including incident management, visual investigations, threat hunting, advanced visualizations, UEBA, and more, for data that must be accessed frequently (the first X months). At the same time, this architecture also enables you to query long-term data by accessing it directly in ADX, or from Azure Sentinel via the ADX proxy feature. Queries against the long-term data stored in ADX can be ported from Azure Sentinel without any changes.

Note

When exporting multiple data tables into ADX via Event Hub, keep in mind that Log Analytics data export has limitations on the maximum number of Event Hubs per namespace. For more information about data export, see Log Analytics workspace data export in Azure Monitor.

For most customers, we recommend using the Event Hub Standard tier. Depending on the number of tables you need to export and the amount of traffic to those tables, you may need to use the Event Hub Dedicated tier. For more information, see the Event Hubs documentation.

Tip

For more information about this procedure, see Tutorial: Ingest and query monitoring data in Azure Data Explorer.

To export data into ADX via an Event Hub:

  1. Configure the Log Analytics data export to an Event Hub. For more information, see Log Analytics workspace data export in Azure Monitor.

  2. Create an ADX cluster and database. For more information, see:

  3. Create target tables. The raw data is first ingested to an intermediate table, where the raw data is stored, manipulated, and expanded.

    An update policy, which is similar to a function applied to all new data, is used to ingest the expanded data into the final table, which has the same schema as the original table in Azure Sentinel.

    Set the retention on the raw table to 0 days. The data is stored only in the properly formatted table, and deleted in the raw table as soon as it's transformed.

    For more information, see Ingest and query monitoring data in Azure Data Explorer. A hedged KQL sketch of steps 3 through 5, and of the retention change in step 7, appears at the end of this procedure.

  4. Create a table mapping. Because the exported data arrives in JSON format, create a JSON mapping that defines how records land in the raw events table as they come in from the Event Hub. For more information, see Create the update policy for metric and log data.

  5. Create an update policy and attach it to the raw records table. In this step, create a function, called an update policy, and attach it to the destination table so that the data is transformed at ingestion time.

    Note

    This step is required only when you want to have data tables in ADX with the same schema and format as in Azure Sentinel.

    For more information, see Connect an Event Hub to Azure Data Explorer.

  6. Create a data connection between the Event Hub and the raw data table in ADX. Configure ADX with the details of the Event Hub from which to ingest the data.

    Use the instructions in the Azure Data Explorer documentation and specify the following details:

    • Target. Specify the specific table with the raw data.
    • Format. Specify .json as the table format.
    • Mapping to be applied. Specify the mapping table created in step 4 above.
  7. Modify retention for the target table. The default Azure Data Explorer retention policy may be far longer than you need.

    Use a retention command like the one at the end of the sketch below to update the retention policy to one year.
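    To make steps 3 through 5 and this retention change more concrete, here is a minimal KQL sketch for a single table. It assumes you are exporting the SecurityEvent table and that the exported JSON arrives with the usual records array; the table, mapping, and function names are placeholders, and the target schema is trimmed to a few columns, so adjust it to match the actual Log Analytics schema.

    // Raw table that receives the exported JSON as-is
    .create table SecurityEventRaw (Records: dynamic)

    // Mapping that tells ADX where to find the records array in each event
    .create table SecurityEventRaw ingestion json mapping 'SecurityEventRawMapping' '[{"column":"Records","Properties":{"path":"$.records"}}]'

    // Final table; in practice this should mirror the Log Analytics SecurityEvent schema
    .create table SecurityEvent (TimeGenerated: datetime, Computer: string, EventID: int, Activity: string)

    // Function that expands raw records into the final schema
    .create function SecurityEventExpand() {
        SecurityEventRaw
        | mv-expand events = Records
        | project TimeGenerated = todatetime(events.TimeGenerated),
                  Computer = tostring(events.Computer),
                  EventID = toint(events.EventID),
                  Activity = tostring(events.Activity)
    }

    // Update policy that runs the function on every ingestion into the raw table
    .alter table SecurityEvent policy update @'[{"Source": "SecurityEventRaw", "Query": "SecurityEventExpand()", "IsEnabled": "True", "IsTransactional": true}]'

    // Keep nothing in the raw table once it has been transformed (step 3)
    .alter-merge table SecurityEventRaw policy retention softdelete = 0d

    // Keep one year of data in the final table (step 7)
    .alter-merge table SecurityEvent policy retention softdelete = 365d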

This section describes the second method: exporting Azure Sentinel data from Log Analytics into Azure Storage, from where Azure Data Factory runs a regular job to further export the data into ADX.

Using Azure Storage and Azure Data Factory enables you to copy data from Azure Storage only when it's close to the retention limit in Azure Sentinel / Log Analytics. There is no data duplication, and ADX is used only to access data that's older than the retention limit in Azure Sentinel.

Tip

While the architecture for using Azure Storage and Azure Data Factory for your legacy data is more complex, this method can offer larger cost savings overall.

The following image shows a sample flow of exported data into Azure Storage, from where Azure Data Factory runs a regular job to further export it into ADX.

To export data into ADX via Azure Storage and Azure Data Factory:

  1. Configure the Log Analytics data export to an Azure Storage account. For more information, see Log Analytics workspace data export in Azure Monitor.

  2. Create an ADX cluster and database. For more information, see:

  3. Create target tables. The raw data is first ingested to an intermediate table, where the raw data is stored, manipulated, and expanded.

    An update policy, which is similar to a function applied to all new data, is used to ingest the expanded data into the final table, which has the same schema as the original table in Azure Sentinel.

    Set the retention on the raw table to 0 days. The data is stored only in the properly formatted table, and deleted in the raw table as soon as it's transformed.

    For more information, see Ingest and query monitoring data in Azure Data Explorer.

  4. Create a table mapping. Because the exported data arrives in JSON format, create a JSON mapping that defines how records land in the raw events table as they are copied in from Azure Storage. For more information, see Create the update policy for metric and log data.

  5. Create an update policy and attach it to the raw records table. In this step, create a function, called an update policy, and attach it to the destination table so that the data is transformed at ingestion time.

    Note

    This step is required only when you want to have data tables in ADX with the same schema and format as in Azure Sentinel.

    For more information, see Connect an Event Hub to Azure Data Explorer.

  6. Set up the Azure Data Factory pipeline:

    • Create linked services for Azure Storage and Azure Data Explorer. For more information, see:

      • Copy data to or from Azure Data Explorer by using Azure Data Factory.
    • Create a dataset from Azure Storage. For more information, see Datasets in Azure Data Factory.

    • Create a data pipeline with a copy operation, based on the LastModifiedDate properties.

      For more information, see Copy new and changed files by LastModifiedDate with Azure Data Factory.
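Once the Data Factory pipeline has run, you can sanity-check the copied data by querying the target table directly in ADX. For example, again assuming a SecurityEvent target table:

SecurityEvent
| where TimeGenerated between (ago(365d) .. ago(90d))
| summarize Events = count() by bin(TimeGenerated, 1d)
| order by TimeGenerated asc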

Design considerations

When storing your Azure Sentinel data in ADX, consider the following elements:

Cluster size and SKU: Plan carefully for the number of nodes and the VM SKU in your cluster. These factors determine the amount of processing power and the size of your hot cache (SSD and memory); the bigger the cache, the more data you can query at higher performance. We encourage you to visit the ADX sizing calculator, where you can experiment with different configurations and see the resulting cost. ADX also has an autoscale capability that makes intelligent decisions to add or remove nodes as needed based on cluster load. For more information, see Manage cluster horizontal scaling (scale out) in Azure Data Explorer to accommodate changing demand.

Hot/cold cache: ADX provides control over which data tables are kept in hot cache and therefore return results faster. If you have large amounts of data in your ADX cluster, consider breaking tables down by month, so that you have greater granularity over the data present in your hot cache. For more information, see Cache policy (hot and cold cache).

Retention: In ADX, you can configure when data is removed from a database or an individual table, which is also an important part of limiting storage costs. For more information, see Retention policy.

Security: Several ADX settings can help you protect your data, such as identity management, encryption, and so on. Specifically for role-based access control (RBAC), ADX can be configured to restrict access to databases, tables, or even rows within a table. For more information, see Security in Azure Data Explorer and Row level security.

Data sharing: ADX allows you to make pieces of data available to other parties, such as partners or vendors, and even to buy data from other parties. For more information, see Use Azure Data Share to share data with Azure Data Explorer.

Other cost components: Consider the additional cost components for each export method.
Exporting data via an Azure Event Hub:
- Log Analytics data export costs, charged per exported GB.
- Event Hub costs, charged per throughput unit.
Exporting data via Azure Storage and Azure Data Factory:
- Log Analytics data export costs, charged per exported GB.
- Azure Storage costs, charged per GB stored.
- Azure Data Factory costs, charged per copy activity run.
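For example, the hot cache window and retention can be tuned per table or per database with commands like the following; the table and database names are placeholders:

// Keep the most recent 30 days of this table in hot cache (SSD and memory)
.alter table SecurityEvent policy caching hot = 30d

// Keep two years of data in the database before it is soft-deleted
.alter database SentinelArchive policy retention softdelete = 730d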

Next steps

Regardless of where you store your data, continue hunting and investigating using Azure Sentinel.

For more information, see: