#
# Copyright (c) 2022 Airbyte, Inc., all rights reserved.
#

import inspect

from airbyte_cdk.sources.declarative.interpolation.interpolated_mapping import InterpolatedMapping
from airbyte_cdk.sources.declarative.interpolation.jinja import JinjaInterpolation
"""
|
|
Create a partial on steroids.
|
|
Returns a partial object which when called will behave like func called with the arguments supplied.
|
|
Parameters will be interpolated before the creation of the object
|
|
The interpolation will take in kwargs, and config as parameters that can be accessed through interpolating.
|
|
If any of the parameters are also create functions, they will also be created.
|
|
kwargs are propagated to the recursive method calls
|
|
:param func: Function
|
|
:param args:
|
|
:param keywords:
|
|
:return: partially created object
|
|
"""
|
|
|
|
|
|
def create(func, /, *args, **keywords):
|
|
def newfunc(*fargs, **fkeywords):
|
|
interpolation = JinjaInterpolation()
|
|
all_keywords = {**keywords}
|
|
all_keywords.update(fkeywords)
|
|
|
|
# config is a special keyword used for interpolation
|
|
config = all_keywords.pop("config", None)
|
|
|
|
# options is a special keyword used for interpolation and propagation
|
|
if "options" in all_keywords:
|
|
options = all_keywords.pop("options")
|
|
else:
|
|
options = dict()
|
|
|
|
# create object's partial parameters
|
|
fully_created = _create_inner_objects(all_keywords, options)
|
|
|
|
# interpolate the parameters
|
|
interpolated_keywords = InterpolatedMapping(fully_created, interpolation).eval(config, **{"options": options})
|
|
interpolated_keywords = {k: v for k, v in interpolated_keywords.items() if v}
|
|
|
|
all_keywords.update(interpolated_keywords)
|
|
|
|
# if config is not none, add it back to the keywords mapping
|
|
if config is not None:
|
|
all_keywords["config"] = config
|
|
|
|
kwargs_to_pass_down = _get_kwargs_to_pass_to_func(func, options)
|
|
all_keywords_to_pass_down = _get_kwargs_to_pass_to_func(func, all_keywords)
|
|
try:
|
|
ret = func(*args, *fargs, **{**all_keywords_to_pass_down, **kwargs_to_pass_down})
|
|
except TypeError as e:
|
|
raise Exception(f"failed to create object of type {func} because {e}")
|
|
return ret
|
|
|
|
newfunc.func = func
|
|
newfunc.args = args
|
|
newfunc.kwargs = keywords
|
|
|
|
return newfunc
|
|
|
|
|
|
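# The pattern above can be sketched without the airbyte_cdk dependencies.
# The following is a hypothetical, dependency-free illustration of the same
# idea (string.Template stands in for Jinja interpolation; create_simple,
# make_record, and the "$user" placeholder are invented for the example):

```python
import string


def create_simple(func, **keywords):
    """Defer construction of func, interpolating string parameters
    against a config supplied at call time."""

    def newfunc(**fkeywords):
        all_keywords = {**keywords, **fkeywords}
        # config is popped out and used only as the interpolation context
        config = all_keywords.pop("config", {})
        # substitute $placeholders in string values from the config mapping
        interpolated = {
            k: string.Template(v).substitute(config) if isinstance(v, str) else v
            for k, v in all_keywords.items()
        }
        return func(**interpolated)

    return newfunc


make_record = create_simple(dict, name="$user", retries=3)
record = make_record(config={"user": "alice"})
# record == {"name": "alice", "retries": 3}
```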
def _get_kwargs_to_pass_to_func(func, kwargs):
    # inspect func's signature and keep only the keyword arguments it accepts
    argspec = inspect.getfullargspec(func)
    all_args = set(argspec.args).union(set(argspec.kwonlyargs))
    return {k: v for k, v in kwargs.items() if k in all_args}

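# A small self-contained sketch of the signature-based filtering used above:
# only keys matching func's positional or keyword-only parameters survive, so
# calling func with the filtered mapping cannot raise an unexpected-keyword
# TypeError. The target function and _filter_accepted_kwargs names are invented
# for illustration:

```python
import inspect


def _filter_accepted_kwargs(func, kwargs):
    # same idea as _get_kwargs_to_pass_to_func: keep only the keys that
    # appear among func's positional or keyword-only parameters
    spec = inspect.getfullargspec(func)
    accepted = set(spec.args) | set(spec.kwonlyargs)
    return {k: v for k, v in kwargs.items() if k in accepted}


def target(a, *, b=1):
    return a + b


filtered = _filter_accepted_kwargs(target, {"a": 2, "b": 3, "c": 99})
# filtered == {"a": 2, "b": 3}; "c" is dropped, so target(**filtered) won't raise
```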
def _create_inner_objects(keywords, kwargs):
    # recursively instantiate any values that are themselves "create" partials,
    # propagating kwargs down to them; other values are passed through untouched
    fully_created = dict()
    for k, v in keywords.items():
        if type(v) == type(create):
            fully_created[k] = v(kwargs=kwargs)
        else:
            fully_created[k] = v
    return fully_created
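# Note on the type(v) == type(create) check above: it holds only for plain
# Python functions (including the closures returned by create), because they
# all share the same type object, types.FunctionType. Classes and builtins
# have different types and are therefore passed through unchanged. A small
# illustrative check (plain_function is invented for the example):

```python
import types


def plain_function():
    return None


# plain functions and lambdas share the type types.FunctionType
assert type(plain_function) is types.FunctionType
assert type(plain_function) == type(lambda: None)
# builtins and classes have different types, so the check is False for them
assert type(len) is not types.FunctionType
assert type(dict) is not types.FunctionType
```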