diff --git a/sdk/storage/Azure.Storage.Files.DataLake/README.md b/sdk/storage/Azure.Storage.Files.DataLake/README.md index 3f7f6ec6e3415..7469fbf641aa2 100644 --- a/sdk/storage/Azure.Storage.Files.DataLake/README.md +++ b/sdk/storage/Azure.Storage.Files.DataLake/README.md @@ -1,6 +1,7 @@ # Azure Storage Files Data Lake client library for .NET > Server Version: 2019-02-02 + Azure Data Lake includes all the capabilities required to make it easy for developers, data scientists, and analysts to store data of any size, shape, and speed, and do all types of processing and analytics across platforms and languages. It removes the complexities of ingesting and storing all of your data @@ -15,7 +16,7 @@ while making it faster to get up and running with batch, streaming, and interact Install the Azure Storage Files Data Lake client library for .NET with [NuGet][nuget]: ```Powershell -dotnet add package Azure.Storage.Files.DataLake --version 12.0.0-preview.4 +dotnet add package Azure.Storage.Files.DataLake --version 12.0.0-preview.5 ``` ### Prerequisites @@ -33,39 +34,181 @@ az storage account create --name MyStorageAccount --resource-group MyResourceGro ## Key concepts -TODO: Add Key Concepts +DataLake Storage Gen2 was designed to: +- Service multiple petabytes of information while sustaining hundreds of gigabits of throughput +- Allow you to easily manage massive amounts of data + +Key Features of DataLake Storage Gen2 include: +- Hadoop compatible access +- A superset of POSIX permissions +- Cost effective in terms of low-cost storage capacity and transactions +- Optimized driver for big data analytics + +A fundamental part of Data Lake Storage Gen2 is the addition of a hierarchical namespace to Blob storage. The hierarchical namespace organizes objects/files into a hierarchy of directories for efficient data access. + +In the past, cloud-based analytics had to compromise in areas of performance, management, and security. Data Lake Storage Gen2 addresses each of these aspects in the following ways: +- Performance is optimized because you do not need to copy or transform data as a prerequisite for analysis. The hierarchical namespace greatly improves the performance of directory management operations, which improves overall job performance. +- Management is easier because you can organize and manipulate files through directories and subdirectories. +- Security is enforceable because you can define POSIX permissions on directories or individual files. +- Cost effectiveness is made possible as Data Lake Storage Gen2 is built on top of the low-cost Azure Blob storage. The additional features further lower the total cost of ownership for running big data analytics on Azure. + +Data Lake Storage Gen2 offers two types of resources: + +- The _filesystem_ used via 'DataLakeFileSystemClient' +- The _path_ used via 'DataLakeFileClient' or 'DataLakeDirectoryClient' + +|ADLS Gen2 | Blob | +| --------------------------| ---------- | +|Filesystem | Container | +|Path (File or Directory) | Blob | + +Note: This client library does not support hierarchical namespace (HNS) disabled storage accounts. 
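To make the mapping concrete, here is a minimal sketch of how the two resource types map onto the client types; the `serviceUri`, `sharedKeyCredential`, and resource names are illustrative placeholders matching the Examples below:

```C#
// Minimal sketch: serviceUri and sharedKeyCredential are assumed to be defined as in the Examples section.
DataLakeServiceClient serviceClient = new DataLakeServiceClient(serviceUri, sharedKeyCredential);

// A filesystem is the Data Lake equivalent of a Blob container
DataLakeFileSystemClient filesystem = serviceClient.GetFileSystemClient("sample-filesystem");

// Paths within that filesystem are either directories or files
DataLakeDirectoryClient directory = filesystem.GetDirectoryClient("sample-directory");
DataLakeFileClient file = directory.GetFileClient("sample-file");
```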
## Examples -TODO: Add Examples +### Create a DataLakeServiceClient +```C# Snippet:SampleSnippetDataLakeServiceClient_Create +StorageSharedKeyCredential sharedKeyCredential = new StorageSharedKeyCredential(storageAccountName, storageAccountKey); -## Troubleshooting +// Create DataLakeServiceClient using StorageSharedKeyCredentials +DataLakeServiceClient serviceClient = new DataLakeServiceClient(serviceUri, sharedKeyCredential); +``` -All File DataLake service operations will throw a -[RequestFailedException][RequestFailedException] on failure with -helpful [`ErrorCode`s][error_codes]. Many of these errors are recoverable. +### Create a DataLakeFileSystemClient +```C# Snippet:SampleSnippetDataLakeFileSystemClient_Create +StorageSharedKeyCredential sharedKeyCredential = new StorageSharedKeyCredential(storageAccountName, storageAccountKey); -TODO: Update sample +// Create DataLakeServiceClient using StorageSharedKeyCredentials +DataLakeServiceClient serviceClient = new DataLakeServiceClient(serviceUri, sharedKeyCredential); -```c# -string connectionString = ""; -// Try to create a container named "sample-container" and avoid any potential race -// conditions that might arise by checking if the container exists before creating -BlobContainerClient container = new BlobContainerClient(connectionString, "sample-container"); -try -{ - container.Create(); -} -catch (RequestFailedException ex) - when (ex.ErrorCode == BlobErrorCode.ContainerAlreadyExists) +// Create a DataLake Filesystem +DataLakeFileSystemClient filesystem = serviceClient.GetFileSystemClient(Randomize("sample-filesystem")); +filesystem.Create(); +``` + +### Create a DataLakeDirectoryClient +```C# Snippet:SampleSnippetDataLakeDirectoryClient_Create +StorageSharedKeyCredential sharedKeyCredential = new StorageSharedKeyCredential(storageAccountName, storageAccountKey); + +// Create DataLakeServiceClient using StorageSharedKeyCredentials +DataLakeServiceClient serviceClient = new DataLakeServiceClient(serviceUri, sharedKeyCredential); + +// Get a reference to a filesystem named "sample-filesystem-append" and then create it +DataLakeFileSystemClient filesystem = serviceClient.GetFileSystemClient(Randomize("sample-filesystem-append")); +filesystem.Create(); + +// Create +DataLakeDirectoryClient directory = filesystem.GetDirectoryClient(Randomize("sample-file")); +directory.Create(); +``` + +### Create a DataLakeFileClient + +Create DataLakeFileClient from a DataLakeDirectoryClient +```C# Snippet:SampleSnippetDataLakeFileClient_Create_Directory +// Create a DataLake Directory +DataLakeDirectoryClient directory = filesystem.CreateDirectory(Randomize("sample-directory")); +directory.Create(); + +// Create a DataLake File using a DataLake Directory +DataLakeFileClient file = directory.GetFileClient(Randomize("sample-file")); +file.Create(); +``` + +Create DataLakeFileClient from a DataLakeFileSystemClient +```C# Snippet:SampleSnippetDataLakeFileClient_Create +// Create a DataLake Filesystem +DataLakeFileSystemClient filesystem = serviceClient.GetFileSystemClient(Randomize("sample-filesystem")); +filesystem.Create(); + +// Create a DataLake file using a DataLake Filesystem +DataLakeFileClient file = filesystem.GetFileClient(Randomize("sample-file")); +file.Create(); +``` + +### Appending Data to a DataLake File +```C# Snippet:SampleSnippetDataLakeFileClient_Append +// Create a file +DataLakeFileClient file = filesystem.GetFileClient(Randomize("sample-file")); +file.Create(); + +// Append data to the DataLake File 
+file.Append(File.OpenRead(sampleFilePath), 0); +file.Flush(SampleFileContent.Length); +``` + +### Reading Data from a DataLake File +```C# Snippet:SampleSnippetDataLakeFileClient_Read +Response<FileDownloadInfo> fileContents = file.Read(); +``` + +### Listing/Traversing through a DataLake Filesystem +```C# Snippet:SampleSnippetDataLakeFileClient_List +foreach (PathItem pathItem in filesystem.ListPaths()) { - // Ignore any errors if the container already exists + names.Add(pathItem.Name); } ``` +### Set Permissions on a DataLake File +```C# Snippet:SampleSnippetDataLakeFileClient_SetPermissions +// Create a DataLake file so we can set the Access Controls on the files +DataLakeFileClient fileClient = filesystem.GetFileClient(Randomize("sample-file")); +fileClient.Create(); + +// Set the Permissions of the file +fileClient.SetPermissions(permissions: "rwxrwxrwx"); +``` + +### Set Access Controls (ACLs) on a DataLake File +```C# Snippet:SampleSnippetDataLakeFileClient_SetAcls +// Create a DataLake file so we can set the Access Controls on the files +DataLakeFileClient fileClient = filesystem.GetFileClient(Randomize("sample-file")); +fileClient.Create(); + +// Set Access Control List +fileClient.SetAccessControl("user::rwx,group::r--,mask::rwx,other::---"); +``` + +### Get Access Controls (ACLs) on a DataLake File +```C# Snippet:SampleSnippetDataLakeFileClient_GetAcls +// Get Access Control List +PathAccessControl accessControlResponse = fileClient.GetAccessControl(); +``` + +### Rename a DataLake File +```C# Snippet:SampleSnippetDataLakeFileClient_RenameFile +DataLakeFileClient renamedFileClient = fileClient.Rename("sample-file2"); +``` + +### Rename a DataLake Directory +```C# Snippet:SampleSnippetDataLakeFileClient_RenameDirectory +DataLakeDirectoryClient renamedDirectoryClient = directoryClient.Rename("sample-directory2"); +``` + +### Get Properties on a DataLake File +```C# Snippet:SampleSnippetDataLakeFileClient_GetProperties +// Get Properties on a File +PathProperties filePathProperties = fileClient.GetProperties(); +``` + +### Get Properties on a DataLake Directory +```C# Snippet:SampleSnippetDataLakeDirectoryClient_GetProperties +// Get Properties on a Directory +PathProperties directoryPathProperties = directoryClient.GetProperties(); +``` + +## Troubleshooting + +All File DataLake service operations will throw a +[RequestFailedException][RequestFailedException] on failure with +helpful [`ErrorCode`s][error_codes]. Many of these errors are recoverable. + ## Next steps -TODO: Link Samples +Get started with our [DataLake samples][samples]: + +1. [Hello World](samples/Sample01a_HelloWorld.cs): Append, Read, and List DataLake Files (or [asynchronously](samples/Sample01b_HelloWorldAsync.cs)) +2. [Auth](samples/Sample02_Auth.cs): Authenticate with public access, shared keys, shared access signatures, and Azure Active Directory. ## Contributing @@ -82,7 +225,7 @@ For more information see the [Code of Conduct FAQ][coc_faq] or contact [opencode@microsoft.com][coc_contact] with any additional questions or comments. -![Impressions](https://azure-sdk-impressions.azurewebsites.net/api/impressions/azure-sdk-for-net%2Fsdk%2Fstorage%2FAzure.Storage.Blobs.Cryptography%2FREADME.png) +![Impressions](https://azure-sdk-impressions.azurewebsites.net/api/impressions/azure-sdk-for-net%2Fsdk%2Fstorage%2FAzure.Storage.Files.DataLake%2FREADME.png) [source]: https://github.com/Azure/azure-sdk-for-net/tree/master/sdk/storage/Azure.Storage.Files.DataLake/src @@ -104,4 +247,5 @@
[cla]: https://cla.microsoft.com [coc]: https://opensource.microsoft.com/codeofconduct/ [coc_faq]: https://opensource.microsoft.com/codeofconduct/faq/ -[coc_contact]: mailto:opencode@microsoft.com \ No newline at end of file +[coc_contact]: mailto:opencode@microsoft.com +[samples]: samples \ No newline at end of file diff --git a/sdk/storage/Azure.Storage.Files.DataLake/samples/Sample01a_HelloWorld.cs b/sdk/storage/Azure.Storage.Files.DataLake/samples/Sample01a_HelloWorld.cs index eeb8b2698532f..f3567d58a17d2 100644 --- a/sdk/storage/Azure.Storage.Files.DataLake/samples/Sample01a_HelloWorld.cs +++ b/sdk/storage/Azure.Storage.Files.DataLake/samples/Sample01a_HelloWorld.cs @@ -18,10 +18,161 @@ namespace Azure.Storage.Files.DataLake.Samples /// public class Sample01a_HelloWorld : SampleTest { + /// + /// Create a DataLake File using a DataLake Filesystem. + /// + [Test] + public void CreateFileClient_Filesystem() + { + // Make StorageSharedKeyCredential to pass to the serviceClient + string storageAccountName = StorageAccountName; + string storageAccountKey = StorageAccountKey; + Uri serviceUri = StorageAccountBlobUri; + + #region Snippet:SampleSnippetDataLakeServiceClient_Create + StorageSharedKeyCredential sharedKeyCredential = new StorageSharedKeyCredential(storageAccountName, storageAccountKey); + + // Create DataLakeServiceClient using StorageSharedKeyCredentials + DataLakeServiceClient serviceClient = new DataLakeServiceClient(serviceUri, sharedKeyCredential); + #endregion Snippet:SampleSnippetDataLakeServiceClient_Create + + #region Snippet:SampleSnippetDataLakeFileClient_Create + // Create a DataLake Filesystem + DataLakeFileSystemClient filesystem = serviceClient.GetFileSystemClient(Randomize("sample-filesystem")); + filesystem.Create(); + + // Create a DataLake file using a DataLake Filesystem + DataLakeFileClient file = filesystem.GetFileClient(Randomize("sample-file")); + file.Create(); + #endregion Snippet:SampleSnippetDataLakeFileClient_Create + + // Verify we created one file + Assert.AreEqual(1, filesystem.ListPaths().Count()); + + // Cleanup + filesystem.Delete(); + } + + /// + /// Create a DataLake File using a DataLake Directory. 
+ /// + [Test] + public void CreateFileClient_Directory() + { + // Make StorageSharedKeyCredential to pass to the serviceClient + string storageAccountName = StorageAccountName; + string storageAccountKey = StorageAccountKey; + Uri serviceUri = StorageAccountBlobUri; + + #region Snippet:SampleSnippetDataLakeFileSystemClient_Create + StorageSharedKeyCredential sharedKeyCredential = new StorageSharedKeyCredential(storageAccountName, storageAccountKey); + + // Create DataLakeServiceClient using StorageSharedKeyCredentials + DataLakeServiceClient serviceClient = new DataLakeServiceClient(serviceUri, sharedKeyCredential); + + // Create a DataLake Filesystem + DataLakeFileSystemClient filesystem = serviceClient.GetFileSystemClient(Randomize("sample-filesystem")); + filesystem.Create(); + #endregion Snippet:SampleSnippetDataLakeFileSystemClient_Create + #region Snippet:SampleSnippetDataLakeFileClient_Create_Directory + // Create a DataLake Directory + DataLakeDirectoryClient directory = filesystem.CreateDirectory(Randomize("sample-directory")); + directory.Create(); + + // Create a DataLake File using a DataLake Directory + DataLakeFileClient file = directory.GetFileClient(Randomize("sample-file")); + file.Create(); + #endregion Snippet:SampleSnippetDataLakeFileClient_Create_Directory + + // Verify we created one file + Assert.AreEqual(1, filesystem.ListPaths().Count()); + + // Cleanup + filesystem.Delete(); + } + + /// + /// Create a DataLake Directory. + /// + [Test] + public void CreateDirectoryClient() + { + // Make StorageSharedKeyCredential to pass to the serviceClient + string storageAccountName = StorageAccountName; + string storageAccountKey = StorageAccountKey; + Uri serviceUri = StorageAccountBlobUri; + + #region Snippet:SampleSnippetDataLakeDirectoryClient_Create + StorageSharedKeyCredential sharedKeyCredential = new StorageSharedKeyCredential(storageAccountName, storageAccountKey); + + // Create DataLakeServiceClient using StorageSharedKeyCredentials + DataLakeServiceClient serviceClient = new DataLakeServiceClient(serviceUri, sharedKeyCredential); + + // Get a reference to a filesystem named "sample-filesystem-append" and then create it + DataLakeFileSystemClient filesystem = serviceClient.GetFileSystemClient(Randomize("sample-filesystem-append")); + filesystem.Create(); + + // Create + DataLakeDirectoryClient directory = filesystem.GetDirectoryClient(Randomize("sample-file")); + directory.Create(); + #endregion Snippet:SampleSnippetDataLakeDirectoryClient_Create + + // Verify we created one directory + Assert.AreEqual(1, filesystem.ListPaths().Count()); + + // Cleanup + filesystem.Delete(); + } + /// /// Upload a file to a DataLake File. 
/// [Test] + public void Append_Simple() + { + // Create Sample File to read content from + string sampleFilePath = CreateTempFile(SampleFileContent); + + // Make StorageSharedKeyCredential to pass to the serviceClient + string storageAccountName = StorageAccountName; + string storageAccountKey = StorageAccountKey; + Uri serviceUri = StorageAccountBlobUri; + + StorageSharedKeyCredential sharedKeyCredential = new StorageSharedKeyCredential(storageAccountName, storageAccountKey); + + // Create DataLakeServiceClient using StorageSharedKeyCredentials + DataLakeServiceClient serviceClient = new DataLakeServiceClient(serviceUri, sharedKeyCredential); + + // Get a reference to a filesystem named "sample-filesystem-append" and then create it + DataLakeFileSystemClient filesystem = serviceClient.GetFileSystemClient(Randomize("sample-filesystem-append")); + filesystem.Create(); + try + { + #region Snippet:SampleSnippetDataLakeFileClient_Append + // Create a file + DataLakeFileClient file = filesystem.GetFileClient(Randomize("sample-file")); + file.Create(); + + // Append data to the DataLake File + file.Append(File.OpenRead(sampleFilePath), 0); + file.Flush(SampleFileContent.Length); + #endregion Snippet:SampleSnippetDataLakeFileClient_Append + + // Verify the contents of the file + PathProperties properties = file.GetProperties(); + Assert.AreEqual(SampleFileContent.Length, properties.ContentLength); + } + finally + { + // Clean up after the test when we're finished + filesystem.Delete(); + } + } + + /// + /// Upload file by appending each part to a DataLake File. + /// + [Test] public void Append() { // Create three temporary Lorem Ipsum files on disk that we can upload @@ -37,7 +188,7 @@ public void Append() StorageSharedKeyCredential sharedKeyCredential = new StorageSharedKeyCredential(storageAccountName, storageAccountKey); - // Get a reference to a FileSystemClient + // Create DataLakeServiceClient using StorageSharedKeyCredentials DataLakeServiceClient serviceClient = new DataLakeServiceClient(serviceUri, sharedKeyCredential); // Get a reference to a filesystem named "sample-filesystem-append" and then create it @@ -106,7 +257,9 @@ public void Read() file.Flush(SampleFileContent.Length); // Download the DataLake file's contents and save it to a file + #region Snippet:SampleSnippetDataLakeFileClient_Read Response<FileDownloadInfo> fileContents = file.Read(); + #endregion Snippet:SampleSnippetDataLakeFileClient_Read using (FileStream stream = File.OpenWrite(downloadPath)) { fileContents.Value.Content.CopyTo(stream); @@ -149,10 +302,12 @@ public void List() // List all the directories List<string> names = new List<string>(); + #region Snippet:SampleSnippetDataLakeFileClient_List foreach (PathItem pathItem in filesystem.ListPaths()) { names.Add(pathItem.Name); } + #endregion Snippet:SampleSnippetDataLakeFileClient_List Assert.AreEqual(3, names.Count); Assert.Contains("sample-directory1", names); Assert.Contains("sample-directory2", names); @@ -293,12 +448,14 @@ public void SetPermissions() filesystem.Create(); try { + #region Snippet:SampleSnippetDataLakeFileClient_SetPermissions // Create a DataLake file so we can set the Access Controls on the files DataLakeFileClient fileClient = filesystem.GetFileClient(Randomize("sample-file")); fileClient.Create(); // Set the Permissions of the file fileClient.SetPermissions(permissions: "rwxrwxrwx"); + #endregion Snippet:SampleSnippetDataLakeFileClient_SetPermissions // Get Access Control List PathAccessControl accessControlResponse = fileClient.GetAccessControl(); @@ -333,15 +490,18 @@
public void SetGetAcls() filesystem.Create(); try { + #region Snippet:SampleSnippetDataLakeFileClient_SetAcls // Create a DataLake file so we can set the Access Controls on the files DataLakeFileClient fileClient = filesystem.GetFileClient(Randomize("sample-file")); fileClient.Create(); // Set Access Control List fileClient.SetAccessControl("user::rwx,group::r--,mask::rwx,other::---"); - + #endregion Snippet:SampleSnippetDataLakeFileClient_SetAcls + #region Snippet:SampleSnippetDataLakeFileClient_GetAcls // Get Access Control List PathAccessControl accessControlResponse = fileClient.GetAccessControl(); + #endregion Snippet:SampleSnippetDataLakeFileClient_GetAcls // Check Access Control permissions Assert.AreEqual("user::rwx,group::r--,mask::rwx,other::---", accessControlResponse.Acl); @@ -354,7 +514,7 @@ public void SetGetAcls() } /// - /// Rename a DataLake file and a DatLake directories in a DataLake Filesystem. + /// Rename a DataLake file and a DataLake directory in a DataLake Filesystem. /// [Test] public void Rename() @@ -378,7 +538,9 @@ public void Rename() directoryClient.Create(); // Rename directory with new path/name and verify by making a service call (e.g. GetProperties) + #region Snippet:SampleSnippetDataLakeFileClient_RenameDirectory DataLakeDirectoryClient renamedDirectoryClient = directoryClient.Rename("sample-directory2"); + #endregion Snippet:SampleSnippetDataLakeFileClient_RenameDirectory PathProperties directoryPathProperties = renamedDirectoryClient.GetProperties(); // Delete the sample directory using the new path/name @@ -389,7 +551,9 @@ public void Rename() fileClient.Create(); // Rename file with new path/name and verify by making a service call (e.g. GetProperties) + #region Snippet:SampleSnippetDataLakeFileClient_RenameFile DataLakeFileClient renamedFileClient = fileClient.Rename("sample-file2"); + #endregion Snippet:SampleSnippetDataLakeFileClient_RenameFile PathProperties filePathProperties = renamedFileClient.GetProperties(); // Delete the sample directory using the new path/name @@ -401,5 +565,50 @@ public void Rename() filesystem.Delete(); } } + + /// + /// Get Properties on a DataLake File and a Directory + /// + [Test] + public void GetProperties() + { + // Make StorageSharedKeyCredential to pass to the serviceClient + string storageAccountName = StorageAccountName; + string storageAccountKey = StorageAccountKey; + Uri serviceUri = StorageAccountBlobUri; + StorageSharedKeyCredential sharedKeyCredential = new StorageSharedKeyCredential(storageAccountName, storageAccountKey); + + // Create DataLakeServiceClient using StorageSharedKeyCredentials + DataLakeServiceClient serviceClient = new DataLakeServiceClient(serviceUri, sharedKeyCredential); + + // Get a reference to a filesystem named "sample-filesystem-rename" and then create it + DataLakeFileSystemClient filesystem = serviceClient.GetFileSystemClient(Randomize("sample-filesystem")); + filesystem.Create(); + try + { + // Create a DataLake Directory to rename it later + DataLakeDirectoryClient directoryClient = filesystem.GetDirectoryClient(Randomize("sample-directory")); + directoryClient.Create(); + + #region Snippet:SampleSnippetDataLakeDirectoryClient_GetProperties + // Get Properties on a Directory + PathProperties directoryPathProperties = directoryClient.GetProperties(); + #endregion Snippet:SampleSnippetDataLakeDirectoryClient_GetProperties + + // Create a DataLake file + DataLakeFileClient fileClient = filesystem.GetFileClient(Randomize("sample-file")); + fileClient.Create(); + + #region 
Snippet:SampleSnippetDataLakeFileClient_GetProperties + // Get Properties on a File + PathProperties filePathProperties = fileClient.GetProperties(); + #endregion Snippet:SampleSnippetDataLakeFileClient_GetProperties + } + finally + { + // Clean up after the test when we're finished + filesystem.Delete(); + } + } } } diff --git a/sdk/storage/Azure.Storage.Files.DataLake/samples/Sample01b_HelloWorldAsync.cs b/sdk/storage/Azure.Storage.Files.DataLake/samples/Sample01b_HelloWorldAsync.cs index 48fd96c08d9f6..07bb8f9cef481 100644 --- a/sdk/storage/Azure.Storage.Files.DataLake/samples/Sample01b_HelloWorldAsync.cs +++ b/sdk/storage/Azure.Storage.Files.DataLake/samples/Sample01b_HelloWorldAsync.cs @@ -17,10 +17,157 @@ namespace Azure.Storage.Files.DataLake.Samples /// public class Sample01b_HelloWorldAsync : SampleTest { + /// + /// Create a DataLake File using a DataLake Filesystem. + /// + [Test] + public async Task CreateFileClientAsync_Filesystem() + { + // Make StorageSharedKeyCredential to pass to the serviceClient + string storageAccountName = StorageAccountName; + string storageAccountKey = StorageAccountKey; + Uri serviceUri = StorageAccountBlobUri; + + StorageSharedKeyCredential sharedKeyCredential = new StorageSharedKeyCredential(storageAccountName, storageAccountKey); + + // Create DataLakeServiceClient using StorageSharedKeyCredentials + DataLakeServiceClient serviceClient = new DataLakeServiceClient(serviceUri, sharedKeyCredential); + + // Get a reference to a filesystem named "sample-filesystem-append" and then create it + DataLakeFileSystemClient filesystem = serviceClient.GetFileSystemClient(Randomize("sample-filesystem-append")); + filesystem.Create(); + + // Create + DataLakeFileClient file = filesystem.GetFileClient(Randomize("sample-file")); + await file.CreateAsync(); + + // Verify we created one file + AsyncPageable<PathItem> response = filesystem.ListPathsAsync(); + IList<PathItem> paths = await response.ToListAsync(); + Assert.AreEqual(1, paths.Count); + + // Cleanup + await filesystem.DeleteAsync(); + } + + /// + /// Create a DataLake File using a DataLake Directory. + /// + [Test] + public async Task CreateFileClientAsync_Directory() + { + // Make StorageSharedKeyCredential to pass to the serviceClient + string storageAccountName = StorageAccountName; + string storageAccountKey = StorageAccountKey; + Uri serviceUri = StorageAccountBlobUri; + + StorageSharedKeyCredential sharedKeyCredential = new StorageSharedKeyCredential(storageAccountName, storageAccountKey); + + // Create DataLakeServiceClient using StorageSharedKeyCredentials + DataLakeServiceClient serviceClient = new DataLakeServiceClient(serviceUri, sharedKeyCredential); + + // Create a DataLake Filesystem + DataLakeFileSystemClient filesystem = serviceClient.GetFileSystemClient(Randomize("sample-filesystem")); + await filesystem.CreateAsync(); + + // Create a DataLake Directory + DataLakeDirectoryClient directory = filesystem.GetDirectoryClient(Randomize("sample-directory")); + await directory.CreateAsync(); + + // Create a DataLake File using a DataLake Directory + DataLakeFileClient file = directory.GetFileClient(Randomize("sample-file")); + await file.CreateAsync(); + + // Verify we created one file + AsyncPageable<PathItem> response = filesystem.ListPathsAsync(); + IList<PathItem> paths = await response.ToListAsync(); + Assert.AreEqual(1, paths.Count); + + // Cleanup + await filesystem.DeleteAsync(); + } + + + /// + /// Create a DataLake Directory.
+ /// + [Test] + public async Task CreateDirectoryClientAsync() + { + // Make StorageSharedKeyCredential to pass to the serviceClient + string storageAccountName = StorageAccountName; + string storageAccountKey = StorageAccountKey; + Uri serviceUri = StorageAccountBlobUri; + + StorageSharedKeyCredential sharedKeyCredential = new StorageSharedKeyCredential(storageAccountName, storageAccountKey); + + // Create DataLakeServiceClient using StorageSharedKeyCredentials + DataLakeServiceClient serviceClient = new DataLakeServiceClient(serviceUri, sharedKeyCredential); + + // Get a reference to a filesystem named "sample-filesystem-append" and then create it + DataLakeFileSystemClient filesystem = serviceClient.GetFileSystemClient(Randomize("sample-filesystem-append")); + filesystem.Create(); + + // Create + DataLakeDirectoryClient directory = filesystem.GetDirectoryClient(Randomize("sample-file")); + await directory.CreateAsync(); + + // Verify we created one directory + AsyncPageable<PathItem> response = filesystem.ListPathsAsync(); + IList<PathItem> paths = await response.ToListAsync(); + Assert.AreEqual(1, paths.Count); + + // Cleanup + await filesystem.DeleteAsync(); + } + /// /// Upload a file to a DataLake File. /// [Test] + public async Task AppendAsync_Simple() + { + // Create Sample File to read content from + string sampleFilePath = CreateTempFile(SampleFileContent); + + // Make StorageSharedKeyCredential to pass to the serviceClient + string storageAccountName = StorageAccountName; + string storageAccountKey = StorageAccountKey; + Uri serviceUri = StorageAccountBlobUri; + + StorageSharedKeyCredential sharedKeyCredential = new StorageSharedKeyCredential(storageAccountName, storageAccountKey); + + // Create DataLakeServiceClient using StorageSharedKeyCredentials + DataLakeServiceClient serviceClient = new DataLakeServiceClient(serviceUri, sharedKeyCredential); + + // Get a reference to a filesystem named "sample-filesystem-append" and then create it + DataLakeFileSystemClient filesystem = serviceClient.GetFileSystemClient(Randomize("sample-filesystem-append")); + await filesystem.CreateAsync(); + try + { + // Create a file + DataLakeFileClient file = filesystem.GetFileClient(Randomize("sample-file")); + await file.CreateAsync(); + + // Append data to the DataLake File + await file.AppendAsync(File.OpenRead(sampleFilePath), 0); + await file.FlushAsync(SampleFileContent.Length); + + // Verify the contents of the file + PathProperties properties = await file.GetPropertiesAsync(); + Assert.AreEqual(SampleFileContent.Length, properties.ContentLength); + } + finally + { + // Clean up after the test when we're finished + await filesystem.DeleteAsync(); + } + } + + /// + /// Upload file by appending each part to a DataLake File.
+ /// + [Test] public async Task AppendAsync() { // Create three temporary Lorem Ipsum files on disk that we can upload @@ -402,5 +549,46 @@ public async Task RenameAsync() await filesystem.DeleteAsync(); } } + + /// + /// Get Properties on a DataLake File and a Directory + /// + [Test] + public async Task GetPropertiesAsync() + { + // Make StorageSharedKeyCredential to pass to the serviceClient + string storageAccountName = StorageAccountName; + string storageAccountKey = StorageAccountKey; + Uri serviceUri = StorageAccountBlobUri; + StorageSharedKeyCredential sharedKeyCredential = new StorageSharedKeyCredential(storageAccountName, storageAccountKey); + + // Create DataLakeServiceClient using StorageSharedKeyCredentials + DataLakeServiceClient serviceClient = new DataLakeServiceClient(serviceUri, sharedKeyCredential); + + // Get a reference to a filesystem named "sample-filesystem-rename" and then create it + DataLakeFileSystemClient filesystem = serviceClient.GetFileSystemClient(Randomize("sample-filesystem")); + await filesystem.CreateAsync(); + try + { + // Create a DataLake Directory to rename it later + DataLakeDirectoryClient directoryClient = filesystem.GetDirectoryClient(Randomize("sample-directory")); + await directoryClient.CreateAsync(); + + // Get Properties on a Directory + PathProperties directoryPathProperties = await directoryClient.GetPropertiesAsync(); + + // Create a DataLake file + DataLakeFileClient fileClient = filesystem.GetFileClient(Randomize("sample-file")); + await fileClient.CreateAsync(); + + // Get Properties on a File + PathProperties filePathProperties = await fileClient.GetPropertiesAsync(); + } + finally + { + // Clean up after the test when we're finished + await filesystem.DeleteAsync(); + } + } } }