{"id":25718,"date":"2018-10-22T18:42:55","date_gmt":"2018-10-22T14:42:55","guid":{"rendered":"https:\/\/www.msp360.com\/resources\/?page_id=25718"},"modified":"2020-09-08T14:33:35","modified_gmt":"2020-09-08T10:33:35","slug":"powershell","status":"publish","type":"page","link":"https:\/\/www.msp360.com\/resources\/documentation\/cli-api\/msp360-explorer\/powershell\/","title":{"rendered":"CloudBerry Explorer PowerShell Snap-in"},"content":{"rendered":"<p>CloudBerry Explorer offers PowerShell extension to manage file operations across Amazon Simple Storage Service (Amazon S3), Amazon Glacier and file system. Windows PowerShell is a command-line shell that helps IT professionals to easily control system and accelerate automation. It includes several system administration utilities, improved navigation of common management data such as the registry, certificate store, or WMI, etc.<!--more--><\/p>\n<p><b>What is good about PowerShell and CloudBerry Explorer Snap-in?<\/b><\/p>\n<p>PowerShell Snap-in allows using the majority of Amazon S3 functionality. You can combine CloudBerry Explorer commands with PowerShell commands. PowerShell is designed to operate with .Net objects, so you are not limited to command syntax. You can write complicated scripts with loops and conditions. You can schedule periodical tasks like data backup or cleanup.<\/p>\n<blockquote>\n<div class=\"wiki text prewrapped\">\n<div class=\"wiki quote\">PowerShell Snap-in is provided as-is. 
We don't offer support for this feature.<\/div>\n<\/div>\n<\/blockquote>\n<p><b>Supported Storage Providers:<\/b>\u00a0Amazon S3, Amazon Glacier, S3-compatible storage providers.<\/p>\n<p>This is an example of copying files from a local disk to an S3 bucket:<\/p>\n<p><b>Example:<\/b><\/p>\n<p>The file results.xls will be copied to the S3 bucket.<\/p>\n<pre><code>$s3 = Get-CloudS3Connection -Key $key -Secret $secret\r\n$destination = $s3 | Select-CloudFolder -path \"myBucket\/weeklyreport\"\r\n$src = Get-CloudFilesystemConnection | Select-CloudFolder \"c:\\sales\\\"\r\n$src | Copy-CloudItem $destination -filter \"results.xls\"<\/code><\/pre>\n<p>This can be scheduled to run every weekend to copy files to S3 storage (for backup purposes, for example).<\/p>\n<p><b>Example:<\/b><\/p>\n<p>This will copy all files and folders from c:\\workdata\\ to the S3 bucket \"myBucket\". A new directory named by date, such as 2008_11_01, will be created.<\/p>\n<pre><code>$new_folder_format = Get-Date -uformat \"%Y_%m_%d\"\r\n$s3 = Get-CloudS3Connection -Key $key -Secret $secret\r\n$destination = $s3 | Select-CloudFolder -path \"myBucket\" | Add-CloudFolder $new_folder_format\r\n$src = Get-CloudFilesystemConnection | Select-CloudFolder -path \"c:\\workdata\\\"\r\n$src | Copy-CloudItem $destination -filter \"*\"<\/code><\/pre>\n<h2>Commands<\/h2>\n<p><b>USING SSE-C IN AMAZON S3<\/b><\/p>\n<p>You can use Server-Side Encryption with a Customer-provided key (SSE-C) when uploading files to Amazon S3, and manage S3 files that are already SSE-C encrypted.<\/p>\n<p>There are new parameters:<\/p>\n<p><b>-DestinationSseCustomerKey<\/b>\u00a0(alias:\u00a0<b>-DstSSEKey<\/b>) \u2013 defines an encryption key for a copy, move, or rename operation. 
This key is needed if you want to encrypt files with SSE-C.<\/p>\n<p><b>-SourceSseCustomerKey<\/b>\u00a0(alias:\u00a0<b>-SrcSSEKey<\/b>) \u2013 defines the encryption key used to download file(s) encrypted with SSE-C from Amazon S3 or to edit their settings.<\/p>\n<p><b>Note:<\/b>\u00a0for operations such as \"local to S3\" and \"S3 to local\" you need to specify only one key: -DstSSEKey for upload; -SrcSSEKey for download. For operations such as \"S3 to S3\" or rename on S3 you can use two keys, and they can be different \u2013 this allows you to modify the SSE-C key for already encrypted files.<\/p>\n<p>These parameters were added to the following commands:<\/p>\n<p>Copy-CloudItem<br \/>\nMove-CloudItem<br \/>\nRename-CloudItem<br \/>\nSet-CloudItemStorageClass (backward compatibility: Set-CloudStorageClass)<br \/>\nAdd-CloudItemHeaders<br \/>\nGet-CloudItemHeaders<\/p>\n<p>There is a new command:<\/p>\n<p><b>Set-CloudItemServerSideEncryption\u00a0<\/b>\u2013 allows you to set or change the SSE settings for an existing S3 file (e.g. set\/reset SSE-C encryption; reset any SSE encryption; switch SSE to SSE-C, or vice versa)<\/p>\n<p><b>Example: Upload to Amazon S3 with SSE-C<\/b><\/p>\n<p>1. Generate a 256-bit encryption key (a 256-bit key for AES-256) \u2013 this example demonstrates key generation using the password-based key derivation function PBKDF2. Note that the password is put in single quotes so that PowerShell does not expand the $-sequence inside it as a variable.<\/p>\n<pre><code>$iterations = 100000\r\n$salt = [byte[]] (1,2,3,4,5,6,7,8)\r\n<span style=\"color: #ff5959;\">$password = 'My$Super9Password'<\/span>\r\n$binaryKey = (New-Object System.Security.Cryptography.Rfc2898DeriveBytes([System.Text.Encoding]::UTF8.GetBytes($password), $salt, $iterations)).GetBytes(32)<\/code><\/pre>\n<p><span style=\"color: #ff5959;\">IMPORTANT NOTE: $password is just an example value. Make sure to use your own character sequence.<\/span><\/p>\n<p>2. 
Copy data from local storage to Amazon S3 with SSE-C using the generated key:<br \/>\n<code>$source | Copy-CloudItem $dest <b>-DstSSEkey $binaryKey<\/b> -filter *<\/code><br \/>\nwhere $source is a local folder and $dest is an Amazon S3 bucket (folder). For example:<\/p>\n<pre><code>$source = Get-CloudFilesystemConnection | Select-CloudFolder \"C:\\Company\\DailyReports\"\r\n$s3 = Get-CloudS3Connection -k yourAccessKey -s yourSecretKey\r\n$dest = $s3 | Select-CloudFolder \"mycompany\/reports\"<\/code><\/pre>\n<p><b>Example: Download an SSE-C encrypted file from Amazon S3<\/b><br \/>\n<code>$dest | Copy-CloudItem $source <b>-SrcSSEKey $binaryKey<\/b> -filter \"monthlyReport-Jul2014.docx\"<\/code><br \/>\nTo move files, just replace Copy-CloudItem with\u00a0<b>Move-CloudItem<\/b>.<\/p>\n<p><b>Example: Rename an existing SSE-C encrypted file, keeping encryption with the same key<\/b><br \/>\n<code>$dest | Rename-CloudItem -name \"monthlyReport-Jul2014.docx\" -newname \"monthlyReport-Aug2014.docx\" <b>-SrcSSEKey $binaryKey -DstSSEKey $binaryKey<\/b><\/code><br \/>\n<b>Example: Copy an existing SSE-C encrypted file inside S3, keeping encryption with the same key<\/b><br \/>\n<code>$dest | Copy-CloudItem $dest2 -filter \"monthlyReport-Jul2014.docx\" <b>-SrcSSEkey $binaryKey -DstSSEkey $binaryKey<\/b><\/code><br \/>\n<b>Example: Set or change SSE-C encryption for an existing S3 file<\/b><\/p>\n<p><u>Encrypt a non-encrypted S3 file with SSE-C<\/u><br \/>\n<code>$dest | Set-CloudItemServerSideEncryption -filter \"monthlyReport-May2014.docx\" <b>-DstSSEkey $binaryKey<\/b><\/code><br \/>\n<u>Encrypt a non-encrypted S3 file with SSE<\/u><br \/>\n<code>$dest | Set-CloudItemServerSideEncryption -filter \"monthlyReport-Apr2014.docx\" <b>-SSE<\/b><\/code><br \/>\n<u>Decrypt an SSE-C encrypted S3 file (i.e. 
reset SSE-C)<\/u><br \/>\n<code>$dest | Set-CloudItemServerSideEncryption -filter \"monthlyReport-May2014.docx\" <b>-SrcSSEKey $binaryKey<\/b><\/code><br \/>\n<u>Reset SSE encryption for an S3 file<\/u><br \/>\n<code>$dest | Set-CloudItemServerSideEncryption -filter \"monthlyReport-Apr2014.docx\" <b>-SSE:$false<\/b><\/code><br \/>\n<b>Example: Change the Storage Class for an SSE-C encrypted file<\/b><br \/>\n<code>$dest | Set-CloudItemStorageClass -filter \"monthlyReport-May2014.docx\" <b>-SrcSSEKey $binaryKey<\/b><\/code><br \/>\n<b>UPLOAD TO AMAZON GLACIER<\/b><\/p>\n<p>You can set up a connection to your Amazon Glacier account, set connection options, upload files to Amazon Glacier, and set filters for the files to upload. You can also restore data from Amazon Glacier using PowerShell commands. Check out the examples below:<\/p>\n<p><b>Example: Uploading to Amazon Glacier<\/b><\/p>\n<p># Add snap-in<br \/>\n<code>add-pssnapin CloudBerryLab.Explorer.PSSnapIn<\/code><br \/>\n# Enable logging and specify the path<br \/>\n<code>Set-Logging -LogPath \"C:\\Users\\user1\\AppData\\Local\\CloudBerry S3 Explorer PRO\\Logs\\PowerShell.log\" -LogLevel Info<\/code><br \/>\n# Create connection<br \/>\n<code>$conn = Get-CloudGlacierConnection -Key [YOUR ACCESS KEY] -Secret [YOUR SECRET KEY]<\/code><br \/>\n# Set options<\/p>\n<pre><code>Set-CloudOption -GlacierRetrievalRateLimitType Specified\r\nSet-CloudOption -GlacierChunkSizeMB 4\r\nSet-CloudOption -GlacierParallelUpload 1\r\nSet-CloudOption -GlacierPeakRetrievalRateLimit 23.5<\/code><\/pre>\n<p># Select vault<br \/>\n<code>$vault = $conn | Select-CloudFolder -Path \"us-east-1\/[YOUR VAULT]\"<\/code><br \/>\n# Let's copy to vault<br \/>\n<code>$destination = $vault<\/code><br \/>\n# Select source folder<br \/>\n<code>$src = Get-CloudFilesystemConnection | Select-CloudFolder \"C:\\Tmp[YOUR SOURCE FOLDER PATH]\"<\/code><br \/>\n# Upload files to Glacier by the filter<br \/>\n<code>#$src | 
Copy-CloudItem $destination -filter \"sample.txt\"<\/code><br \/>\n# Upload all files to Glacier<br \/>\n<code>$src | Copy-CloudItem $destination -filter \"*\"<\/code><br \/>\n# Delete vault<br \/>\n<code>$conn | Remove-CloudBucket $vault<\/code><br \/>\n<b>Example: Retrieving data from Amazon Glacier<\/b><\/p>\n<p># Add snap-in<br \/>\n<code>add-pssnapin CloudBerryLab.Explorer.PSSnapIn<\/code><br \/>\n# Enable logging and specify the path<br \/>\n<code>Set-Logging -LogPath \"C:\\Users\\user1\\AppData\\Local\\CloudBerry S3 Explorer PRO\\Logs\\PowerShell.log\" -LogLevel Info<\/code><br \/>\n# Create connection<br \/>\n<code>$conn = Get-CloudGlacierConnection -Key [YOUR ACCESS KEY] -Secret [YOUR SECRET KEY]<\/code><br \/>\n# Get existing vault<br \/>\n<code>$vault = $conn | Select-CloudFolder -Path \"us-east-1\/[YOUR VAULT]\"<\/code><br \/>\n# Get vault inventory.<\/p>\n<p><b>Note:<\/b>\u00a0this command may take up to 5 hours to execute if inventory has not been prepared yet.<br \/>\n<code>$invJob = $vault | Get-Inventory<\/code><br \/>\n# Now read vault archives<br \/>\n<code>$archives = $vault | get-clouditem<\/code><br \/>\n# Select destination local folder.<br \/>\n<code>$dst = Get-CloudFilesystemConnection | Select-CloudFolder \"C:\\Tmp [YOUR DESTINATION FOLDER PATH]\"<\/code><br \/>\n# Copy files from vault. Only files located in C:\\Tmp folder are copied.<\/p>\n<p><b>Note:<\/b>\u00a0this command may take many hours to execute when files have not been prepared for copying yet.<br \/>\n<code>$vault | Copy-CloudItem $dst -filter \"C:\\Tmp\\*.*\"<\/code><br \/>\n<b>ENABLING SERVER SIDE ENCRYPTION<\/b><\/p>\n<p>SSE is enabled with \"-sse\" switch. 
It applies to the Copy-CloudItem and Copy-CloudSyncFolders commands when uploading to Amazon S3.<\/p>\n<p><b>Example: Enabling SSE for Copy-CloudItem<\/b>:<br \/>\n<code>$source | Copy-CloudItem $dest -Filter *.mov <b>-sse<\/b><\/code><br \/>\n<b>Example: Enabling SSE for Copy-CloudSyncFolders<\/b>:<br \/>\n<code>$src | Copy-CloudSyncFolders $destination -IncludeFiles \"*.jpg\" <b>-sse<\/b><\/code><br \/>\n<b>Example: Enable SSL for a connection<\/b>:<\/p>\n<p># Create a connection with SSL<br \/>\n<code>$s3 = Get-CloudS3Connection -UseSSL -Key $key -Secret $secret<\/code><br \/>\n<b>Options supported for Copy-CloudSyncFolders<\/b>:<\/p>\n<p><b>-StorageClass<\/b>\u00a0defines the storage class for files (it can be rrs, standard, or standard_ia)<\/p>\n<p><b>-IncludeFiles<\/b>\u00a0allows you to specify certain files for sync using the standard wildcards (for example: *.exe; *.dll; d*t.doc; *.t?t)<\/p>\n<p><b>-ExcludeFiles<\/b>\u00a0allows you to exclude certain files from sync using the standard wildcards (for example: *.exe; *.dll; d*t.doc; *.t?t)<\/p>\n<p><b>-ExcludeFolders<\/b>\u00a0allows you to skip certain folders (for example: bin; *temp*; My*)<\/p>\n<p><b>Example: Sync only JPG files and set the RRS storage class while syncing the files to S3 storage<\/b><\/p>\n<pre><code>$s3 = Get-CloudS3Connection -Key $key -Secret $secret\r\n$destination = $s3 | Select-CloudFolder -Path \"myBucket\/weeklyreport\"\r\n$src = Get-CloudFilesystemConnection | Select-CloudFolder \"c:\\sales\\\"\r\n$src | Copy-CloudSyncFolders $destination -IncludeFiles \"*.jpg\" -StorageClass rrs<\/code><\/pre>\n<p><b>Example: Sync an entire folder, excluding the \\temp folder and .tmp files<\/b><\/p>\n<pre><code>$s3 = Get-CloudS3Connection -Key $key -Secret $secret\r\n$destination = $s3 | Select-CloudFolder -Path \"myBucket\/weeklyreport\"\r\n$src = Get-CloudFilesystemConnection | Select-CloudFolder \"c:\\sales\\\"\r\n$src | Copy-CloudSyncFolders $destination -IncludeSubfolders -ExcludeFiles \"*.tmp\" 
-ExcludeFolders \"temp\"<\/code><\/pre>\n<p><b>SETTING A STORAGE CLASS<\/b><\/p>\n<p>You can set a storage class for a certain file or several files:<br \/>\n<code>Set-CloudStorageClass<\/code><br \/>\n<b>Storage Class:<\/b>\u00a0rrs, standard, standard_ia<\/p>\n<p><b>Example: Setting an RRS storage class to a specified item:<\/b><\/p>\n<pre><code>$s3 = Get-CloudS3Connection -Key $key -Secret $secret\r\n$bucket = $s3 | Select-CloudFolder -Path $bucketname\r\n$item = $bucket | Get-CloudItem $itemname\r\n$item | Set-CloudStorageClass -StorageClass rrs<\/code><\/pre>\n<p><b>Example: Setting RRS storage class to all text files in a specified folder:<\/b><\/p>\n<pre><code>$s3 = Get-CloudS3Connection -Key $key -Secret $secret\r\n$bucket = $s3 | Select-CloudFolder -Path $bucketname\r\n$folder = $bucket | Get-CloudItem $foldername\r\n$folder | Set-CloudStorageClass -Filter *.txt -StorageClass rrs<\/code><\/pre>\n<p>Or you can set storage class while copying files to S3 storage -StorageClass in\u00a0<b>Copy-CloudItem<\/b>.<\/p>\n<p><b>Example: Setting RRS storage class to a file while uploading it to the S3 storage:<\/b><\/p>\n<pre><code>$s3 = Get-CloudS3Connection -Key $key -Secret $secret\r\n$destination = $s3 | Select-CloudFolder -path \"myBucket\/weeklyreport\"\r\n$src = Get-CloudFilesystemConnection | Select-CloudFolder \"c:\\sales\\\"\r\n$src | Copy-CloudItem $destination -filter \"results.xls\" -StorageClass rrs<\/code><\/pre>\n<p><b>ADVANCED PARAMETERS FOR \"Copy-CloudSyncFolders\"<\/b><\/p>\n<p>Copy-CloudSyncFolders supports advanced parameters:<\/p>\n<p><b>-DeleteOnTarget<\/b>\u00a0delete files from the target if they no longer exist on the source<\/p>\n<p><b>-IncludeSubfolders<\/b>\u00a0include subfolders into synchronization<\/p>\n<p><b>-CompareByContent<\/b>\u00a0use MD5 hash to compare the content of files (PRO only)<\/p>\n<p><b>-MissingOnly<\/b>\u00a0copy only missing files, ignore files that exist both on source and target<\/p>\n<p><b>GENERATING WEB 
URLs<\/b><\/p>\n<p>Using Get-CloudUrl you can generate HTTP, HTTPS or RTMP URLs and also HTML code for streaming video files.<\/p>\n<p><b>Example: Generating short URL for JPG files and save the output to a file<\/b><br \/>\n<code>$dest | Get-CloudUrl -Filter *.jpg -Type HTTP -ChilpIt &gt;&gt; C:\\urls.txt<\/code><br \/>\n<b>Example: Generating signed URL<\/b><br \/>\n<code>$dest | Get-CloudUrl -Filter *.jpg -Type HTTPS -Expire 01\/01\/2011 &gt;&gt; C:\\urls.txt<\/code><br \/>\n<b>Example: Generating CloudFront signed URL (where $domain is a CloudFront distribution domain name)<\/b><br \/>\n<code>$dest | Get-CloudUrl -Filter *.jpg -Type HTTP -Expire 01\/01\/2011 -DomainName $domain&gt;&gt; C:\\urls.txt<\/code><br \/>\n<b>Example: Generate signed URL for the private content item (where $domain is Streaming distribution domain name)<\/b><\/p>\n<pre><code>$policy = New-CloudPolicy -PrivateKey $privatekey -KeyPairId $keypairid -IsCanned\r\n$dest | Get-CloudUrl -Filter *.flv -Type RTMP -Policy $policy -Expire 01\/01\/2011 -DomainName $domain &gt;&gt; C:\\urls.txt<\/code><\/pre>\n<p><b>SETTING CUSTOM CONTENT TYPES AND HTTP HEADERS<\/b><\/p>\n<p><b>Example: Adding a new content type for .flv<\/b><\/p>\n<pre><code>Add-CloudContentType -Extension .flv -Type video\/x-flv\r\nGet-CloudContentTypes\u00a0- displays a list of predefined and custom content types<\/code><\/pre>\n<p>Any file with .flv extension uploaded to S3 will have a proper content-type: video\/x-flv.<\/p>\n<p><b>Example: Getting HTTP headers for an item ($s3 is an S3 connection)<\/b><br \/>\n<code>$s3 | Select-CloudFolder myvideos | Get-CloudItem cats.flv | Get-CloudItemHeaders<\/code><br \/>\n<b>Example: Setting HTTP headers to items<\/b><\/p>\n<pre><code>$headers = New-CloudHeaders Expires \"Thu, 1 Apr 12:00:00 GMT\"\r\n$s3 | Select-CloudFolder myvideos | Add-CloudItemHeaders -Filter *.flv -Headers $headers<\/code><\/pre>\n<p><b>Example: Setting HTTP headers when copy\/move<\/b><\/p>\n<pre><code>$headers = 
New-CloudHeaders Cache-Control private\r\n$source | Copy-CloudItem $dest -Filter *.mov -Headers $headers<\/code><\/pre>\n<p><b>RENAMING ITEMS<\/b><\/p>\n<p><b>Example: Renaming the folder \"favorites\" to \"thrillers\" in the bucket \"myvideos\"<\/b><br \/>\n<code>$s3 | Select-CloudFolder myvideos | Rename-CloudItem -Name favorites -NewName thrillers<\/code><br \/>\n<b>APPLY ACL FOR ALL SUBFOLDERS AND FILES<\/b><\/p>\n<p><b>Example: Make all files inside \"myvideos\/thrillers\" and its subfolders publicly readable<\/b><br \/>\n<code>$s3 | Select-CloudFolder myvideos\/thrillers | Add-CloudItemPermission -UserName \"All Users\" -Read -Descendants<\/code><br \/>\n<b>SET LOGGING FOR POWERSHELL<\/b><br \/>\n<code>Set-Logging -LogPath &lt;path&gt; -LogLevel &lt;value&gt;<\/code><br \/>\n<b>Values:<\/b>\u00a0nolog, fatal, error, warning, info, debug<\/p>\n<p><b>ADVANCED OPTIONS (PRO ONLY)<\/b><br \/>\n<code>Set-CloudOption -ThreadCount &lt;number&gt;<\/code><br \/>\nDefines the number of threads for multithreaded uploading\/downloading.<br \/>\n<code>Set-CloudOption -UseCompression &lt;value&gt;<\/code><br \/>\nDefines whether to use compression or not.<br \/>\n<code>Set-CloudOption -UseChunks &lt;value&gt; -ChunkSizeKB &lt;sizeinKB&gt;<\/code><br \/>\nDefines the size of the chunk in KB; files larger than a chunk will be divided into chunks.<\/p>\n<p><b>Values:<\/b>\u00a01 or 0<\/p>\n<p>To download a file that was divided into chunks on S3 storage as a single file, enable \"chunk transparency\" mode before downloading:<br \/>\n<code>Set-CloudOption -ChunkTransparency 1<\/code><br \/>\nWhen you copy or move files to S3, these files can inherit the ACL from the parent object: a bucket or folder.<br \/>\n<code>Set-CloudOption -PermissionsInheritance &lt;value&gt;<\/code><br \/>\n<b>Values:<\/b>\u00a0\"donotinherit\", \"onlyforcloudfront\", \"inheritall\"<\/p>\n<p><b>Example:<\/b><\/p>\n<pre><code>Set-CloudOption 
-PermissionsInheritance \"inheritall\"\r\n$s3 = Get-CloudS3Connection &lt;key&gt; &lt;secret&gt;\r\n$destination = $s3 | Select-CloudFolder -path \"myBucket\/weeklyreport\"\r\n$src = Get-CloudFilesystemConnection | Select-CloudFolder \"c:\\sales\\\"\r\n$src | Copy-CloudItem $destination -filter \"results.xls\"<\/code><\/pre>\n<p>The file \"result.xls\" will automatically have the same ACL as \"myBucket\/weeklyreport\".<br \/>\n<code>Set-CloudOption -KeepExistingHeaders<\/code><br \/>\nKeep existing HTTP headers when replacing files on S3.<br \/>\n<code>Set-CloudOption -DoNotChangePermissionsForExisting &lt;value&gt;<\/code><br \/>\nKeep ACL for files when replacing them on S3.<\/p>\n<p><b>Values:<\/b>\u00a01 or 0<br \/>\n<code>Set-CloudOption -KeepExistingPemissionsOnCloudCopy &lt;value&gt;<\/code><br \/>\nKeep\u00a0<b>source<\/b>\u00a0permissions when copying\u00a0<b>within S3<\/b>.<\/p>\n<p><b>Values:<\/b>\u00a01 or 0<\/p>\n<p><b>Copy-CloudSyncFolders<\/b><\/p>\n<p>Copy-CloudSyncFolders synchronizes local folders with the Amazon S3 bucket. You should specify the source folder (local or S3) in the pipeline.<\/p>\n<p><b>-Source<\/b>\u00a0&lt;CloudFolder&gt; Amazon S3 bucket or folder or local folder<\/p>\n<p><b>-Target<\/b>\u00a0&lt;CloudFolder&gt; Amazon S3 bucket or folder or local folder<\/p>\n<p><b>Example:<\/b><\/p>\n<pre><code>$s3 = Get-CloudS3Connection &lt;key&gt; &lt;secret&gt;\r\n$source = $s3 | Select-CloudFolder -Path boooks\/sync\r\n$local = Get-CloudFileSystemConnection\r\n$target = $local | Select-CloudFolder C:\\temp\\sync\r\n$source | Copy-CloudSyncFolders $target<\/code><\/pre>\n<p>Or synchronize content in both ways.<br \/>\n<code>$source | Copy-CloudSyncFolders $target -Bidirectional<\/code><br \/>\n<b>New-CloudBucket<\/b><\/p>\n<p>New-CloudBucket Creates a new bucket. 
You should specify the S3 connection in the pipeline.<\/p>\n<p><b>-Connection<\/b>\u00a0&lt;CloudS3Connection&gt; - S3 connection<\/p>\n<p><b>-Name<\/b>\u00a0&lt;String&gt; - Bucket name<\/p>\n<p><b>-Location<\/b>\u00a0&lt;String&gt; - Bucket location: US (USA) or EU (Europe). By default, the US location is used.<\/p>\n<p><b>Example:<\/b><\/p>\n<pre><code>$s3 = Get-CloudS3Connection &lt;key&gt; &lt;secret&gt;\r\n$s3 | New-CloudBucket mytestbucket EU<\/code><\/pre>\n<p><b>Remove-CloudBucket<\/b>\u00a0- Removes a bucket. All of the bucket's contents must be removed before the bucket itself is removed. This can take a long time; progress is displayed.<\/p>\n<p>-<b>Connection<\/b>\u00a0&lt;CloudS3Connection&gt; S3 Connection<\/p>\n<p>-<b>Name<\/b>\u00a0&lt;String&gt; Bucket name<\/p>\n<p>-<b>Force<\/b>\u00a0Suppress warning messages<\/p>\n<p>-<b>Bucket<\/b>\u00a0&lt;CloudFolder&gt; Bucket object<\/p>\n<p><b>Example:<\/b><br \/>\n<code>$s3 | Remove-CloudBucket mytestbucket<\/code><br \/>\n<b>Get-CloudItemACL<\/b>\u00a0- Returns all access control entries for the specified item. It can be an S3 bucket, folder, or file. You can get the item using the Select-CloudFolder or Get-CloudItem commands.<\/p>\n<p><b>-Item<\/b>\u00a0&lt;CloudItem&gt; Cloud item; it can be a bucket, S3 folder, or S3 file.<\/p>\n<p><b>Example:<\/b><br \/>\n<code>$fld = $s3 | Select-CloudFolder mytestbucket\/documents<\/code><br \/>\n<code>$fld | Get-CloudItemACL<\/code><br \/>\n<b>Add-CloudItemPermission\u00a0<\/b>- Grants a permission to a user or group. If the user is not in the ACL, a user entry will be added.<\/p>\n<p><b>-Item &lt;CloudItem&gt;<\/b>\u00a0Cloud item; it can be a bucket, S3 folder, or S3 file.<\/p>\n<p><b>-UserName &lt;String&gt;<\/b>\u00a0Username or group<\/p>\n<p><b>-Write<\/b>\u00a0Grant write permission<\/p>\n<p><b>-WriteACP<\/b>\u00a0Grant write ACP permission<\/p>\n<p><b>-Read<\/b>\u00a0Grant read permission<\/p>\n<p><b>-ReadACP<\/b>\u00a0Grant read ACP permission<\/p>\n<p><b>-FullControl<\/b>\u00a0Grant full control permission. 
This means that all other permissions will be granted as well.<\/p>\n<p><b>-CloudACE &lt;CloudACE&gt;<\/b>\u00a0Access control entry<\/p>\n<p><b>Example:<\/b><br \/>\n<code>$fld | Add-CloudItemPermission \"All Users\" -Read<\/code><br \/>\n<b>Remove-CloudItemPermission\u00a0<\/b>- Revokes a permission from a user or group. If the RemoveUser parameter is specified, the user entry will be removed from the access control list.<\/p>\n<p><b>-Item &lt;CloudItem&gt;<\/b>\u00a0Cloud item; it can be a bucket, S3 folder, or S3 file.<\/p>\n<p><b>-UserName &lt;String&gt;<\/b>\u00a0Username or group<\/p>\n<p><b>-Write<\/b>\u00a0Revoke write permission<\/p>\n<p><b>-WriteACP<\/b>\u00a0Revoke write ACP permission<\/p>\n<p><b>-Read<\/b>\u00a0Revoke read permission<\/p>\n<p><b>-ReadACP<\/b>\u00a0Revoke read ACP permission<\/p>\n<p><b>-FullControl<\/b>\u00a0Revoke full control permission. This means that all other permissions will be removed as well.<\/p>\n<p><b>-CloudACE &lt;CloudACE&gt;<\/b>\u00a0Access control entry<\/p>\n<p><b>Example:<\/b><br \/>\n<code>$fld | Remove-CloudItemPermission \"All Users\" -Read<\/code><br \/>\n<b>Set-CloudOption<\/b>\u00a0- Sets options for the snap-in<\/p>\n<p><b>-PathStyle &lt;String&gt;<\/b>\u00a0- Use path-style addressing if this flag is specified; virtual-host style otherwise.<\/p>\n<p><b>-ProxyAddress &lt;String&gt;<\/b>\u00a0- Proxy address<\/p>\n<p><b>-ProxyPort &lt;Int32&gt;<\/b>\u00a0- Proxy port<\/p>\n<p><b>-ProxyUser &lt;String&gt;<\/b>\u00a0- Proxy user name<\/p>\n<p><b>-ProxyPassword &lt;String&gt;<\/b>\u00a0- Proxy user password<\/p>\n<p><b>-CheckFileConsistency<\/b>\u00a0- Check file consistency. 
The MD5 hash is used for checking.<\/p>\n<h2>Other Commands<\/h2>\n<p><b>Add-CloudFolder<\/b>\u00a0- Create a new folder<\/p>\n<p><code>-Folder &lt;CloudFolder&gt;<\/code>\u00a0- Current folder<br \/>\n<code>-Name &lt;String&gt;<\/code>\u00a0- New folder name<\/p>\n<p><b>Copy-CloudItem<\/b>\u00a0- Copy a cloud item (file or folder) to the Destination<\/p>\n<p><code>-Destination &lt;CloudFolder&gt;<\/code>\u00a0- Destination folder<br \/>\n<code>-Filter &lt;String&gt;<\/code>\u00a0- Item filter; * and ? are permitted<br \/>\n<code>-Folder &lt;CloudFolder&gt;<\/code>\u00a0- Current folder<\/p>\n<p><b>Get-CloudFilesystemConnection<\/b>\u00a0- Get a connection to the local file system<\/p>\n<p><b>Get-CloudItem<\/b>\u00a0- List files and folders in the current folder<\/p>\n<p><code>-Filter &lt;String&gt;<\/code>\u00a0- Item filter; * and ? are permitted<br \/>\n<code>-Folder &lt;CloudFolder&gt;<\/code>\u00a0- Current folder<\/p>\n<p><b>Get-CloudRootFolder<\/b>\u00a0- Get root folders<\/p>\n<p><code>-Connection &lt;BaseCloudConnection&gt;<\/code>\u00a0- Connection object<\/p>\n<p><b>Get-CloudS3Connection<\/b>\u00a0- Set up an Amazon S3 connection<\/p>\n<p><code>-Key &lt;String&gt;<\/code>\u00a0- Access Key for the S3 connection<br \/>\n<code>-Secret &lt;String&gt;<\/code>\u00a0- Secret Key for the S3 connection<br \/>\n<code>-Endpoint &lt;String&gt;<\/code>\u00a0- Endpoint for S3-compatible storage<br \/>\n<code>-UseSSL<\/code>\u00a0- Enables SSL for the connection<br \/>\n<code>-SignatureVersion<\/code>\u00a0- Defines the authentication version. 
By default, 4.<\/p>\n<p>Note: to define an S3-compatible connection, you need to specify signature version 2:<br \/>\n<code>$s3 = Get-CloudS3Connection -Key $key -Secret $secret -SignatureVersion 2<\/code><\/p>\n<p><b>Get-CloudGlacierConnection<\/b>\u00a0- Set up an Amazon Glacier connection<\/p>\n<p><code>-Key &lt;String&gt;<\/code>\u00a0- Access Key for the Amazon Glacier connection<br \/>\n<code>-Secret &lt;String&gt;<\/code>\u00a0- Secret Key for the Amazon Glacier connection<br \/>\n<code>-UseSSL<\/code>\u00a0- Enables SSL for the connection<\/p>\n<p><b>Move-CloudItem<\/b>\u00a0- Move a cloud item (file or folder) to the Destination<\/p>\n<p><code>-Destination &lt;CloudFolder&gt;<\/code>\u00a0- Destination folder<br \/>\n<code>-Filter &lt;String&gt;<\/code>\u00a0- Item filter; * and ? are permitted<br \/>\n<code>-Folder &lt;Folder&gt;<\/code>\u00a0- Current folder<\/p>\n<p><b>Remove-CloudItem<\/b>\u00a0- Remove cloud items (files or folders)<\/p>\n<p><code>-Filter &lt;String&gt;<\/code>\u00a0- Item filter; * and ? are permitted<br \/>\n<code>-Folder &lt;CloudFolder&gt;<\/code>\u00a0- Current folder<\/p>\n<p><b>Select-CloudFolder<\/b>\u00a0- Get a cloud folder. Use it to obtain the folder that other commands treat as the current folder.<\/p>\n<p><code>-Connection &lt;BaseCloudConnection&gt;<\/code>\u00a0- Connection object<br \/>\n<code>-Path &lt;String&gt;<\/code>\u00a0- Path<br \/>\n<code>-Folder &lt;CloudFolder&gt;<\/code>\u00a0- Folder object<\/p>\n<h2>Installation<\/h2>\n<p>The PowerShell Snap-In must be registered and added to the console.<\/p>\n<p><b>System Requirements<\/b><\/p>\n<p>.NET Framework 4.0 (full version)<br \/>\nWindows Management Framework 3.0<\/p>\n<p><b>Registering Snap-In<\/b><\/p>\n<p>If PowerShell was installed before CloudBerry Explorer, you do not need to register the Snap-in manually. 
Otherwise, run the following command in the CloudBerry Explorer installation folder (c:\\Program Files\\CloudBerryLab\\CloudBerry Explorer for Amazon S3):<br \/>\nC:\\Windows\\Microsoft.NET\\Framework\\v4.0.30319\\InstallUtil.exe CloudBerryLab.Explorer.PSSnapIn.dll<\/p>\n<p><b>Note:<\/b>\u00a0For x64, the command is: C:\\Windows\\Microsoft.NET\\Framework64\\v4.0.30319\\InstallUtil.exe \"C:\\Program Files (x86)\\CloudBerryLab\\CloudBerry Explorer for Amazon S3\\CloudBerryLab.Explorer.PSSnapIn.dll\"<\/p>\n<p><b>Note:<\/b>\u00a0For the PRO version, the default installation folder is \"C:\\Program Files\\CloudBerryLab\\CloudBerry S3 Explorer PRO\"; on x64 - \"C:\\Program Files (x86)\\CloudBerryLab\\CloudBerry S3 Explorer PRO\"<\/p>\n<p><b>Note:<\/b>\u00a0You can do this from the command line or PowerShell.<\/p>\n<p>You can verify that the CloudBerry Explorer Snap-in is registered. Run the following command:<\/p>\n<p><code>Get-PSsnapin -Registered<\/code><\/p>\n<p>PowerShell displays the registered Snap-Ins. Check that CloudBerryLab.Explorer.PSSnapIn is on the list.<\/p>\n<p><b>Adding Snap-In to console<\/b><\/p>\n<p>You can check that the CloudBerry Explorer Snap-in is registered by running the command above.<br \/>\nTo add the Snap-In to the console, run the following command:<\/p>\n<p><code>Add-PSSnapin CloudBerryLab.Explorer.PSSnapIn<\/code><\/p>\n<p>Now the new commands will be available.<\/p>\n<p><b>Exporting console configuration<\/b><\/p>\n<p>You should run the Add-PSSnapin command every time you start PowerShell, or you can save the configuration as follows:<\/p>\n<ul>\n<li>Run PowerShell.<\/li>\n<li>Add the Snap-In to the console.<\/li>\n<li>Run the command: Export-Console CloudBerryExplorerConfig<\/li>\n<\/ul>\n<p>CloudBerryExplorerConfig is the name of a console file to save the configuration. 
To start the PowerShell from a saved configuration run the command:<br \/>\nC:\\Program Files\\Command Shell&gt; PS -PSConsoleFile CloudBerryExplorerConfig.psc1.<br \/>\nCloudBerry Explorer commands will be available.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>CloudBerry Explorer offers PowerShell extension to manage file operations across Amazon Simple Storage Service (Amazon S3), Amazon Glacier and file system. Windows PowerShell is a command-line shell that helps IT professionals to easily control system and accelerate automation. It includes several system administration utilities, improved navigation of common management data such as the registry, certificate [&hellip;]<\/p>\n","protected":false},"author":2,"featured_media":0,"parent":25715,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"page-template-cli.php","meta":{"_acf_changed":false,"footnotes":""},"class_list":["post-25718","page","type-page","status-publish","hentry","types-cli"],"acf":[],"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/www.msp360.com\/resources\/wp-json\/wp\/v2\/pages\/25718","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.msp360.com\/resources\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/www.msp360.com\/resources\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/www.msp360.com\/resources\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.msp360.com\/resources\/wp-json\/wp\/v2\/comments?post=25718"}],"version-history":[{"count":0,"href":"https:\/\/www.msp360.com\/resources\/wp-json\/wp\/v2\/pages\/25718\/revisions"}],"up":[{"embeddable":true,"href":"https:\/\/www.msp360.com\/resources\/wp-json\/wp\/v2\/pages\/25715"}],"wp:attachment":[{"href":"https:\/\/www.msp360.com\/resources\/wp-json\/wp\/v2\/media?parent=25718"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}