Files
impala/tests/util/adls_util.py
Sahil Takiar ac87278b16 IMPALA-8950: Add -d, -f options to hdfs copyFromLocal, put, cp
Add the -d option and -f option to the following commands:

`hdfs dfs -copyFromLocal <localsrc> URI`
`hdfs dfs -put [ - | <localsrc1> .. ]. <dst>`
`hdfs dfs -cp URI [URI ...] <dest>`

The -d option "Skip[s] creation of temporary file with the suffix
._COPYING_.", which improves the performance of these commands on S3,
since S3 does not support metadata-only renames.

The -f option "Overwrites the destination if it already exists".
Combined with HADOOP-13884, this mitigates S3 consistency issues by
avoiding a HEAD request to check whether the destination file exists.
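
For example (the local file and bucket paths below are purely
illustrative), a copy can now run as a single overwriting upload with
no temporary-file rename and no pre-existence check:

`hdfs dfs -copyFromLocal -d -f /tmp/data.txt s3a://some-bucket/test-warehouse/data.txt`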

Added the method 'copy_from_local' to the BaseFilesystem class.
Refactored most usages of the aforementioned HDFS commands to use
the filesystem_client. Some usages were not appropriate or worth
refactoring, so in those cases this patch just adds the '-d' and '-f'
options explicitly. All calls to '-put' were replaced with
'copyFromLocal' because both copy files from the local fs to an
HDFS-compatible target fs.
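
As a rough sketch of what the shell-backed implementation of the new
hook looks like (the subprocess call and error handling here are
illustrative, not the exact code in tests/util/hdfs_util.py):

import subprocess

class HadoopFsCommandLineClient(object):
  """Illustrative stand-in for the CLI client in tests/util/hdfs_util.py."""

  def __init__(self, filesystem_type="HDFS"):
    self.filesystem_type = filesystem_type

  def copy_from_local(self, src, dst):
    # '-f' overwrites any existing destination and '-d' skips the
    # ._COPYING_ temporary file, matching the behavior described above.
    rc = subprocess.call(['hdfs', 'dfs', '-copyFromLocal', '-d', '-f', src, dst])
    assert rc == 0, "copyFromLocal failed: %s -> %s" % (src, dst)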

Since WebHDFS does not have good support for copying files, this patch
removes the copy functionality from the PyWebHdfsClientWithChmod.
Refactored the hdfs_client so that it uses a DelegatingHdfsClient that
delegates to either the HadoopFsCommandLineClient or the
PyWebHdfsClientWithChmod.
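
The delegation itself is straightforward; a minimal sketch (the method
routing here is assumed, the real DelegatingHdfsClient in
tests/util/hdfs_util.py may differ):

class DelegatingHdfsClient(object):
  def __init__(self, webhdfs_client, cli_client):
    self.webhdfs_client = webhdfs_client  # PyWebHdfsClientWithChmod
    self.cli_client = cli_client          # HadoopFsCommandLineClient

  def copy(self, src, dst, overwrite=True):
    # WebHDFS cannot copy files server-side, so copies go through the CLI.
    return self.cli_client.copy(src, dst, overwrite)

  def copy_from_local(self, src, dst):
    return self.cli_client.copy_from_local(src, dst)

  def __getattr__(self, name):
    # Everything else (mkdir, delete, chmod, ...) falls through to WebHDFS.
    return getattr(self.webhdfs_client, name)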

Testing:
* Ran core tests on HDFS and S3

Change-Id: I0d45db1c00554e6fb6bcc0b552596d86d4e30144
Reviewed-on: http://gerrit.cloudera.org:8080/14311
Reviewed-by: Impala Public Jenkins <impala-public-jenkins@cloudera.com>
Tested-by: Impala Public Jenkins <impala-public-jenkins@cloudera.com>
2019-10-05 00:04:08 +00:00

78 lines
2.8 KiB
Python

# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
#
# ADLS access utilities
#
# This file uses the azure-data-lake-store-python client and provides simple
# functions to the Impala test suite to access Azure Data Lake Store.
from azure.datalake.store import core, lib, multithread, exceptions
from tests.util.filesystem_base import BaseFilesystem
from tests.util.filesystem_utils import ADLS_CLIENT_ID, ADLS_TENANT_ID, ADLS_CLIENT_SECRET
from tests.util.hdfs_util import HadoopFsCommandLineClient

class ADLSClient(BaseFilesystem):

  def __init__(self, store):
    self.token = lib.auth(tenant_id=ADLS_TENANT_ID,
                          client_secret=ADLS_CLIENT_SECRET,
                          client_id=ADLS_CLIENT_ID)
    self.adlsclient = core.AzureDLFileSystem(self.token, store_name=store)
    self.adls_cli_client = HadoopFsCommandLineClient("ADLS")

  def create_file(self, path, file_data, overwrite=True):
    if not overwrite and self.exists(path): return False
    with self.adlsclient.open(path, 'wb') as f:
      num_bytes = f.write(file_data)
      assert num_bytes == len(file_data), "ADLS write failed."
    return True

  def make_dir(self, path, permission=None):
    self.adlsclient.mkdir(path)
    return True

  def copy(self, src, dst, overwrite=True):
    self.adls_cli_client.copy(src, dst, overwrite)

  def copy_from_local(self, src, dst):
    self.adls_cli_client.copy_from_local(src, dst)

  def ls(self, path):
    file_paths = self.adlsclient.ls(path)
    files = []
    for f in file_paths:
      fname = f.split("/")[-1]
      if not fname == '':
        files += [fname]
    return files

  def exists(self, path):
    return self.adlsclient.exists(path)

  def delete_file_dir(self, path, recursive=False):
    try:
      self.adlsclient.rm(path, recursive)
    except exceptions.FileNotFoundError:
      return False
    return True

  def get_all_file_sizes(self, path):
    """Returns a list of integers which are all the file sizes of files found under
    'path'."""
    return [self.adlsclient.info(f)['length'] for f in self.adlsclient.ls(path)
            if self.adlsclient.info(f)['type'] == 'FILE']