code (string, 20 to 4.93k chars) · docstring (string, 33 to 1.27k chars) · source (string, 3 classes)
def sqrt(cls, x: 'TensorFluent') -> 'TensorFluent': return cls._unary_op(x, tf.sqrt, tf.float32)
Returns a TensorFluent for the sqrt function. Args: x: The input fluent. Returns: A TensorFluent wrapping the sqrt function.
juraj-google-style
def __hash__(self): return hash(self.path)
Hash function. Returns: the hash value of its path. NOTE(daiyip): KeyPath shares the same hash as its JSONPath representation (relative form), thus we can look up a dict with a KeyPath key by string, and vice versa.
github-repos
def __init__(self, file_object, encoding='utf-8'):
    super(FileObjectOutputWriter, self).__init__(encoding=encoding)
    self._errors = 'strict'
    self._file_object = file_object
Initializes a file object command line interface output writer. Args: file_object (file): file-like object to write the output to. encoding (Optional[str]): output encoding.
juraj-google-style
def get_diff_coeff(hvec, n=1):
    hvec = np.array(hvec, dtype=float)
    acc = len(hvec)
    exp = np.column_stack([np.arange(acc)] * acc)
    a = np.vstack([hvec] * acc) ** exp
    b = np.zeros(acc)
    b[n] = factorial(n)
    return np.linalg.solve(a, b)
Helper function to find difference coefficients of a derivative on an arbitrary mesh. Args: hvec (1D array-like): sampling stencil n (int): degree of derivative to find Returns: array: the finite difference coefficients for the stencil
codesearchnet
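A quick standalone check of the row above (a sketch; numpy and math.factorial are the only dependencies): solving the same Vandermonde system on the stencil [-1, 0, 1] recovers the textbook central-difference coefficients for the first derivative.

import numpy as np
from math import factorial

def get_diff_coeff(hvec, n=1):
    # Solve sum_j c_j * h_j**i == n! * delta_{i,n} for the coefficients c_j.
    hvec = np.array(hvec, dtype=float)
    acc = len(hvec)
    exp = np.column_stack([np.arange(acc)] * acc)
    a = np.vstack([hvec] * acc) ** exp
    b = np.zeros(acc)
    b[n] = factorial(n)
    return np.linalg.solve(a, b)

print(get_diff_coeff([-1, 0, 1], n=1))  # [-0.5  0.   0.5]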
def modify(self, modification, obj):
    for action, settings in modification.items():
        if action in self.supported_actions:
            self.supported_actions[action].__call__(obj, settings)
        elif self.strict:
            raise ValueError('{} is not a supported action!'.format(action))
Note that modify makes actual in-place modifications. It does not return a copy. Args: modification (dict): Modification must be {action_keyword : settings}. E.g., {'_set': {'Hello':'Universe', 'Bye': 'World'}} obj (dict/str/object): Object to modify depending on actions. For example, for DictActions, obj will be a dict to be modified. For FileActions, obj will be a string with a full pathname to a file.
codesearchnet
class RunOneDetector(beam.PTransform[beam.PCollection[NestedKeyedInputT], beam.PCollection[NestedKeyedOutputT]]):
    def __init__(self, detector):
        self._detector = detector

    def expand(self, input: beam.PCollection[NestedKeyedInputT]) -> beam.PCollection[NestedKeyedOutputT]:
        model_id = getattr(self._detector, '_model_id', getattr(self._detector, '_key', 'unknown_model'))
        model_uuid = f'{model_id}:{uuid.uuid4().hex[:6]}'
        ret = input | beam.Reshuffle() | f'Score and Learn ({model_uuid})' >> RunScoreAndLearn(self._detector)
        if self._detector._threshold_criterion:
            ret = ret | f'Run Threshold Criterion ({model_uuid})' >> RunThresholdCriterion(self._detector._threshold_criterion)
        return ret
Runs a single anomaly detector on a PCollection of data. This PTransform applies a single `AnomalyDetector` to the input data, including scoring, learning, and thresholding. Args: detector: The `AnomalyDetector` to run.
github-repos
def sort_orbitals(element_pdos):
    sorted_orbitals = ['s', 'p', 'py', 'pz', 'px', 'd', 'dxy', 'dyz', 'dz2', 'dxz', 'dx2', 'f', 'f_3', 'f_2', 'f_1', 'f_0', 'f1', 'f2', 'f3']
    unsorted_keys = element_pdos.keys()
    sorted_keys = []
    for key in sorted_orbitals:
        if key in unsorted_keys:
            sorted_keys.append(key)
    return sorted_keys
Sort the orbitals of an element's projected density of states. Sorts the orbitals based on a standard format. E.g. s < p < d. Will also sort lm decomposed orbitals. This is useful for plotting/saving. Args: element_pdos (dict): An element's pdos. Should be formatted as a :obj:`dict` of ``{orbital: dos}``. Where dos is a :obj:`~pymatgen.electronic_structure.dos.Dos` object. For example:: {'s': dos, 'px': dos} Returns: list: The sorted orbitals.
codesearchnet
def merge_dicts(dicts, op=operator.add):
    a = None
    for b in dicts:
        if a is None:
            a = b.copy()
        else:
            merged = {**a, **b}
            merged.update({k: op(a[k], b[k]) for k in set(b) & set(a)})
            a = merged
    return a
Merge a list of dictionaries. Args: dicts (list): a list of dictionary objects op (operator): an operator item used to merge the dictionaries. Defaults to :py:func:`operator.add`. Returns: dict: the merged dictionary
juraj-google-style
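Usage sketch for the row above (a self-contained restatement of the merge logic so the example runs as-is): keys present in both inputs are combined with op, everything else passes through.

import operator

def merge_dicts(dicts, op=operator.add):
    # Shared keys are combined with `op`; unique keys are carried over.
    merged = None
    for d in dicts:
        if merged is None:
            merged = d.copy()
        else:
            combined = {**merged, **d}
            combined.update({k: op(merged[k], d[k]) for k in set(merged) & set(d)})
            merged = combined
    return merged

print(merge_dicts([{'a': 1, 'b': 2}, {'a': 10, 'c': 3}]))  # {'a': 11, 'b': 2, 'c': 3}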
class TFSwinPatchMerging(keras.layers.Layer): def __init__(self, input_resolution: Tuple[int, int], dim: int, norm_layer: Optional[Callable]=None, **kwargs) -> None: super().__init__(**kwargs) self.input_resolution = input_resolution self.dim = dim self.reduction = keras.layers.Dense(2 * dim, use_bias=False, name='reduction') if norm_layer is None: self.norm = keras.layers.LayerNormalization(epsilon=1e-05, name='norm') else: self.norm = norm_layer(name='norm') def maybe_pad(self, input_feature: tf.Tensor, height: int, width: int) -> tf.Tensor: should_pad = height % 2 == 1 or width % 2 == 1 if should_pad: pad_values = ((0, 0), (0, height % 2), (0, width % 2), (0, 0)) input_feature = tf.pad(input_feature, pad_values) return input_feature def call(self, input_feature: tf.Tensor, input_dimensions: Tuple[int, int], training: bool=False) -> tf.Tensor: height, width = input_dimensions batch_size, _, num_channels = shape_list(input_feature) input_feature = tf.reshape(input_feature, (batch_size, height, width, num_channels)) input_feature = self.maybe_pad(input_feature, height, width) input_feature_0 = input_feature[:, 0::2, 0::2, :] input_feature_1 = input_feature[:, 1::2, 0::2, :] input_feature_2 = input_feature[:, 0::2, 1::2, :] input_feature_3 = input_feature[:, 1::2, 1::2, :] input_feature = tf.concat([input_feature_0, input_feature_1, input_feature_2, input_feature_3], -1) input_feature = tf.reshape(input_feature, (batch_size, -1, 4 * num_channels)) input_feature = self.norm(input_feature, training=training) input_feature = self.reduction(input_feature, training=training) return input_feature def build(self, input_shape=None): if self.built: return self.built = True if getattr(self, 'reduction', None) is not None: with tf.name_scope(self.reduction.name): self.reduction.build([None, None, 4 * self.dim]) if getattr(self, 'norm', None) is not None: with tf.name_scope(self.norm.name): self.norm.build([None, None, 4 * self.dim])
Patch Merging Layer. Args: input_resolution (`Tuple[int]`): Resolution of input feature. dim (`int`): Number of input channels. norm_layer (`keras.layer.Layer`, *optional*, defaults to `keras.layers.LayerNormalization`): Normalization layer class.
github-repos
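The core of the layer above is the 2x2 neighborhood regrouping; a minimal NumPy sketch (sizes are illustrative) shows how an H x W x C feature map becomes (H/2 * W/2) tokens of 4C channels before the LayerNorm and Dense reduction.

import numpy as np

batch, height, width, channels = 1, 4, 4, 3
x = np.arange(batch * height * width * channels, dtype=np.float32).reshape(batch, height, width, channels)

p0 = x[:, 0::2, 0::2, :]  # top-left of each 2x2 block
p1 = x[:, 1::2, 0::2, :]  # bottom-left
p2 = x[:, 0::2, 1::2, :]  # top-right
p3 = x[:, 1::2, 1::2, :]  # bottom-right

merged = np.concatenate([p0, p1, p2, p3], axis=-1)  # (1, 2, 2, 12)
merged = merged.reshape(batch, -1, 4 * channels)    # (1, 4, 12)
print(merged.shape)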
def map(self, callback: Callable[[T], U]) -> 'Option[U]': return self._type.Some(callback(self._val)) if self._is_some else cast('Option[U]', NONE)
Applies the ``callback`` with the contained value as its argument or returns :py:data:`NONE`. Args: callback: The callback to apply to the contained value. Returns: The ``callback`` result wrapped in an :class:`Option` if the contained value is ``Some``, otherwise :py:data:`NONE` Examples: >>> Some(10).map(lambda x: x * x) Some(100) >>> NONE.map(lambda x: x * x) NONE
juraj-google-style
def part_studio_stl(self, did, wid, eid):
    req_headers = {'Accept': 'application/vnd.onshape.v1+octet-stream'}
    return self._api.request('get', '/api/partstudios/d/' + did + '/w/' + wid + '/e/' + eid + '/stl', headers=req_headers)
Exports an STL file from a part studio. Args: - did (str): Document ID - wid (str): Workspace ID - eid (str): Element ID Returns: - requests.Response: Onshape response data
codesearchnet
def metamodel_from_file(file_name, **kwargs):
    with codecs.open(file_name, 'r', 'utf-8') as f:
        lang_desc = f.read()
    metamodel = metamodel_from_str(lang_desc=lang_desc, file_name=file_name, **kwargs)
    return metamodel
Creates new metamodel from the given file. Args: file_name(str): The name of the file with textX language description. other params: See metamodel_from_str.
juraj-google-style
def ces_distance(C1, C2):
    if config.USE_SMALL_PHI_DIFFERENCE_FOR_CES_DISTANCE:
        return round(small_phi_ces_distance(C1, C2), config.PRECISION)
    concepts_only_in_C1 = [c1 for c1 in C1 if not any(c1.emd_eq(c2) for c2 in C2)]
    concepts_only_in_C2 = [c2 for c2 in C2 if not any(c2.emd_eq(c1) for c1 in C1)]
    if not concepts_only_in_C1 or not concepts_only_in_C2:
        dist = _ces_distance_simple(C1, C2)
    else:
        dist = _ces_distance_emd(concepts_only_in_C1, concepts_only_in_C2)
    return round(dist, config.PRECISION)
Return the distance between two cause-effect structures. Args: C1 (CauseEffectStructure): The first |CauseEffectStructure|. C2 (CauseEffectStructure): The second |CauseEffectStructure|. Returns: float: The distance between the two cause-effect structures in concept space.
juraj-google-style
def _ip_string_from_prefix(self, prefixlen=None):
    if prefixlen is None:
        prefixlen = self._prefixlen
    return self._string_from_ip_int(self._ip_int_from_prefix(prefixlen))
Turn a prefix length into a dotted decimal string. Args: prefixlen: An integer, the netmask prefix length. Returns: A string, the dotted decimal netmask string.
codesearchnet
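The stdlib exposes the same prefix-to-netmask conversion, which makes a handy cross-check for the row above:

import ipaddress

print(ipaddress.ip_network('192.0.2.0/24').netmask)  # 255.255.255.0
print(ipaddress.ip_network('10.0.0.0/20').netmask)   # 255.255.240.0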
def add(self, path, compress=None):
    if os.path.isdir(path):
        self.add_dir(path, compress)
    else:
        self.add_file(path, compress)
Add `path` to the MAR file. If `path` is a file, it will be added directly. If `path` is a directory, it will be traversed recursively and all files inside will be added. Args: path (str): path to file or directory on disk to add to this MAR file compress (str): One of 'xz', 'bz2', or None. Defaults to None.
juraj-google-style
def from_string(cls, contents): mol = None charge = None spin_multiplicity = None params = dict() lines = contents.split('\n') parse_section = False section_name = None section_text = [] ghost_atoms = None for line_num, line in enumerate(lines): l = line.strip().lower() if len(l) == 0: continue if (not parse_section) and (l == "$end" or not l.startswith("$")): raise ValueError("Format error, parsing failed") if parse_section and l != "$end": section_text.append(line) if l.startswith("$") and not parse_section: parse_section = True section_name = l[1:] available_sections = ["comment", "molecule", "rem"] + \ sorted(list(cls.optional_keywords_list)) if section_name not in available_sections: raise ValueError("Unrecognized keyword " + line.strip() + " at line " + str(line_num)) if section_name in params: raise ValueError("duplicated keyword " + line.strip() + "at line " + str(line_num)) if parse_section and l == "$end": func_name = "_parse_" + section_name if func_name not in QcTask.__dict__: raise Exception(func_name + " is not implemented yet, " "please implement it") parse_func = QcTask.__dict__[func_name].__get__(None, QcTask) if section_name == "molecule": mol, charge, spin_multiplicity, ghost_atoms = parse_func(section_text) else: d = parse_func(section_text) params[section_name] = d parse_section = False section_name = None section_text = [] if parse_section: raise ValueError("Format error. " + section_name + " is not " "terminated") jobtype = params["rem"]["jobtype"] title = params.get("comment", None) exchange = params["rem"].get("exchange", "hf") method = params["rem"].get("method", None) correlation = params["rem"].get("correlation", None) basis_set = params["rem"]["basis"] aux_basis_set = params["rem"].get("aux_basis", None) ecp = params["rem"].get("ecp", None) optional_params = None op_keys = set(params.keys()) - {"comment", "rem"} if len(op_keys) > 0: optional_params = dict() for k in op_keys: optional_params[k] = params[k] return QcTask(molecule=mol, charge=charge, spin_multiplicity=spin_multiplicity, jobtype=jobtype, title=title, exchange=exchange, correlation=correlation, basis_set=basis_set, aux_basis_set=aux_basis_set, ecp=ecp, rem_params=params["rem"], optional_params=optional_params, ghost_atoms=ghost_atoms, method=method)
Creates QcTask from a string. Args: contents: String representing a QChem input file. Returns: QcTask object
juraj-google-style
def _broadcast_half(ac_0: _LayerBroadcaster, a_1: RowPartition) -> Tuple[_LayerBroadcaster, RowPartition]:
    c_1 = ac_0.broadcast_row_partition(a_1)
    old_value_rowids = array_ops.gather(ac_0.gather_index, c_1.value_rowids())
    old_row_starts = array_ops.gather(a_1.row_splits(), old_value_rowids)
    gather_index = old_row_starts + c_1.offsets_in_rows()
    return [_LayerBroadcaster.from_gather_index(gather_index), c_1]
Does a NOOP broadcast of a_1.

    *--ac_0-->*
    |         |
   a_1       c_1
    |         |
    V         V
    *--ac_1-->*

Note that by definition this cannot fail: there is always a well-defined NOOP broadcast. This is usually intended as half of broadcasting two shapes together. Args: ac_0: previous LayerBroadcaster a_1: previous RowPartition Returns: [ac_1, c_1] where ac_1 is the next LayerBroadcaster, and c_1 is the broadcast RowPartition
github-repos
def step(self, input_stream, value):
    reading = IOTileReading(input_stream.encode(), self.tick_count, value)
    self.sensor_graph.process_input(input_stream, reading, self.rpc_executor)
Step the sensor graph through one single input. The internal tick count is not advanced so this function may be called as many times as desired to input specific conditions without simulation time passing. Args: input_stream (DataStream): The input stream to push the value into value (int): The reading value to push as an integer
codesearchnet
def element(self, using, value): return self._execute(Command.FIND_ELEMENT, {'using': using, 'value': value})
Find an element in the current context. Support: Android iOS Web(WebView) Args: using(str): The element location strategy. value(str): The value of the location strategy. Returns: WebElement Object. Raises: WebDriverException.
codesearchnet
def Wait(self, context, *args):
Wait for and validate an interaction event. This method should block and wait for a specific interaction. For example, this method might wait for a specific message over a TCP connection. Args: context: Context of this event. context.source: Source role for this event. Use the attributes of the source role to validate where the event came from. context.target: Target role for this event. Use the attributes of the target role to validate the event recipient. *args: Additional arguments for validating the event. These arguments should be used to validate the incoming event. Returns: True if the event was successfully validated.
github-repos
def __init__(self, env, keys=None):
    self.env = env
    if keys is None:
        assert self.env.use_object_obs, "Object observations need to be enabled."
        keys = ["robot-state", "object-state"]
    self.keys = keys
    flat_ob = self._flatten_obs(self.env.reset(), verbose=True)
    self.obs_dim = flat_ob.size
    high = np.inf * np.ones(self.obs_dim)
    low = -high
    self.observation_space = spaces.Box(low=low, high=high)
    low, high = self.env.action_spec
    self.action_space = spaces.Box(low=low, high=high)
Initializes the Gym wrapper. Args: env (MujocoEnv instance): The environment to wrap. keys (list of strings): If provided, each observation will consist of concatenated keys from the wrapped environment's observation dictionary. Defaults to robot-state and object-state.
juraj-google-style
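The _flatten_obs helper is not shown in this row; presumably it concatenates the selected keys of the observation dictionary, along these lines (a sketch with made-up shapes):

import numpy as np

obs = {"robot-state": np.zeros(7), "object-state": np.ones(3)}  # hypothetical observation dict
keys = ["robot-state", "object-state"]
flat = np.concatenate([np.asarray(obs[k]).flatten() for k in keys])
print(flat.shape)  # (10,)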
def submit_jobs(job_specs):
    gk = get_api_client()
    jobs = []
    try:
        for site, job_spec in job_specs:
            logger.info("Submitting %s on %s" % (job_spec, site))
            jobs.append(gk.sites[site].jobs.create(job_spec))
    except Exception:
        logger.error("An error occurred during the job submissions")
        logger.error("Cleaning the jobs created")
        for job in jobs:
            job.delete()
        raise
    return jobs
Submit jobs. Args: job_specs (list): The job specifications as (site, job_spec) tuples (see the Grid'5000 API reference). Returns: list: the submitted jobs.
juraj-google-style
def get_device(ads, **kwargs):
    filtered = get_devices(ads, **kwargs)
    if len(filtered) == 1:
        return filtered[0]
    else:
        serials = [ad.serial for ad in filtered]
        raise Error('More than one device matched: %s' % serials)
Finds a unique AndroidDevice instance from a list that has specific attributes of certain values. Deprecated, use `get_devices(ads, **kwargs)[0]` instead. This method will be removed in 1.8. Example: get_device(android_devices, label='foo', phone_number='1234567890') get_device(android_devices, model='angler') Args: ads: A list of AndroidDevice instances. kwargs: keyword arguments used to filter AndroidDevice instances. Returns: The target AndroidDevice instance. Raises: Error: None or more than one device is matched.
codesearchnet
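get_devices is not shown in this row; a plausible sketch of the attribute filter it implies (hypothetical helper, matching every keyword against object attributes):

from collections import namedtuple

def get_devices(ads, **kwargs):
    # Keep objects whose attributes match every keyword filter.
    return [ad for ad in ads
            if all(getattr(ad, k, None) == v for k, v in kwargs.items())]

Device = namedtuple('Device', ['serial', 'model'])  # stand-in for AndroidDevice
ads = [Device('A1', 'angler'), Device('B2', 'bullhead')]
print(get_devices(ads, model='angler'))  # [Device(serial='A1', model='angler')]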
def from_any_pb(pb_type, any_pb):
    msg = pb_type()
    if callable(getattr(pb_type, "pb", None)):
        msg_pb = pb_type.pb(msg)
    else:
        msg_pb = msg
    if not any_pb.Unpack(msg_pb):
        raise TypeError(
            "Could not convert {} to {}".format(
                any_pb.__class__.__name__, pb_type.__name__
            )
        )
    return msg
Converts an ``Any`` protobuf to the specified message type. Args: pb_type (type): the type of the message that any_pb stores an instance of. any_pb (google.protobuf.any_pb2.Any): the object to be converted. Returns: pb_type: An instance of the pb_type message. Raises: TypeError: if the message could not be converted.
juraj-google-style
def _create_job_info(self, job_dir):
    meta = self._build_job_meta(job_dir)
    self.logger.debug('Create job: %s' % meta)
    job_record = JobRecord.from_json(meta)
    job_record.save()
Create information for given job. Meta file will be loaded if exists, and the job information will be saved in db backend. Args: job_dir (str): Directory path of the job.
codesearchnet
def GetTSKFileByPathSpec(self, path_spec):
    inode = getattr(path_spec, 'inode', None)
    location = getattr(path_spec, 'location', None)
    if inode is not None:
        tsk_file = self._tsk_file_system.open_meta(inode=inode)
    elif location is not None:
        tsk_file = self._tsk_file_system.open(location)
    else:
        raise errors.PathSpecError('Path specification missing inode and location.')
    return tsk_file
Retrieves the SleuthKit file object for a path specification. Args: path_spec (PathSpec): path specification. Returns: pytsk3.File: TSK file. Raises: PathSpecError: if the path specification is missing inode and location.
codesearchnet
def set_tif(self, interface):
    if not ((1 << interface) & self.supported_tifs()):
        raise errors.JLinkException('Unsupported target interface: %s' % interface)
    res = self._dll.JLINKARM_TIF_Select(interface)
    if res != 0:
        return False
    self._tif = interface
    return True
Selects the specified target interface. Note that a restart must be triggered for this to take effect. Args: self (Jlink): the ``JLink`` instance interface (int): integer identifier of the interface Returns: ``True`` if target was updated, otherwise ``False``. Raises: JLinkException: if the given interface is invalid or unsupported.
juraj-google-style
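The capability check above is a plain bitmask test; a tiny self-contained illustration (the mask value is made up):

supported = 0b0011  # hypothetical mask: bit i set means target interface i is supported

def is_supported(interface, mask):
    return bool((1 << interface) & mask)

print(is_supported(1, supported))  # True
print(is_supported(3, supported))  # False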
def prepare_srcs(deps: list[str], deps_destinations: list[str], srcs_dir: str) -> None:
    path_to_replace = {'external/local_xla/': 'tensorflow/compiler', 'external/local_tsl/': 'tensorflow'}
    deps_mapping_dict = {}
    for deps_destination in deps_destinations:
        with open(deps_destination, 'r') as deps_destination_file:
            deps_mapping_dict.update(json.load(deps_destination_file))
    for file in deps:
        for path, val in path_to_replace.items():
            if path in file:
                copy_file(file, os.path.join(srcs_dir, val), path)
                break
        else:
            if 'external' not in file:
                if file in deps_mapping_dict:
                    dest = deps_mapping_dict[file]
                    if dest:
                        copy_file(file, srcs_dir, None, dest)
                else:
                    copy_file(file, srcs_dir, None, None)
Rearranges source files into the target directory. Excludes `external` files and moves vendored xla/tsl files accordingly. Args: deps: a list of paths to files. deps_destinations: a list of json files with mapping of deps to their destinations for deps whose original path and path inside the wheel are different. srcs_dir: target directory where files are copied to.
github-repos
async def run(self, login: LoginProtocol): self._print('%d +++| %s', bytes(socket_info.get())) await self._do_greeting(login) while True: resp: Response try: cmd = await self._read_command() except (ConnectionError, EOFError): break except NotParseable as exc: resp = BadCommandResponse(exc) else: try: if isinstance(cmd, NoOpCommand): resp = NoOpResponse(cmd.tag) elif isinstance(cmd, LogoutCommand): resp = Response(Condition.BYE) elif isinstance(cmd, CapabilityCommand): resp = CapabilitiesResponse(self.capabilities) elif self._session is None: if isinstance(cmd, AuthenticateCommand): resp = await self._do_authenticate(login, cmd) elif isinstance(cmd, StartTLSCommand): resp = await self._do_starttls() else: resp = Response(Condition.NO, text='Bad command.') else: if isinstance(cmd, UnauthenticateCommand): resp = await self._do_unauthenticate() else: assert self._session.filter_set is not None state = FilterState(self._session.filter_set, self.config) resp = await state.run(cmd) except Exception: _log.exception('Unhandled exception') resp = Response(Condition.NO, text='Server error.') await self._write_response(resp) if resp.is_bye: break self._print('%d ---| %s', b'<disconnected>')
Start the socket communication with the server greeting, and then enter the command/response cycle. Args: login: The login/authentication function.
juraj-google-style
def __init__(self, layouts: Optional[sparse_core_layout_pb2.SparseCoreTableLayouts]=None):
    self._checkpoint_layouts = {}
    self._checkpoint_to_reshard_callback = {}
    if layouts:
        for layout in layouts.tables:
            self._checkpoint_layouts[layout.table_name] = layout
An adapter for TPUEmbeddingV3 checkpoints. Constructs an adapter for TPUEmbeddingV3 to handle layout changes between checkpoint values and the embedding object being restored. Args: layouts: The target layouts required.
github-repos
def PreprocessSources( self, artifacts_registry_object, source_path_specs, resolver_context=None): detected_operating_systems = [] for source_path_spec in source_path_specs: try: file_system, mount_point = self.GetSourceFileSystem( source_path_spec, resolver_context=resolver_context) except (RuntimeError, dfvfs_errors.BackEndError) as exception: logger.error(exception) continue try: searcher = file_system_searcher.FileSystemSearcher( file_system, mount_point) operating_system = self._DetermineOperatingSystem(searcher) if operating_system != definitions.OPERATING_SYSTEM_FAMILY_UNKNOWN: preprocess_manager.PreprocessPluginsManager.RunPlugins( artifacts_registry_object, file_system, mount_point, self.knowledge_base) detected_operating_systems.append(operating_system) finally: file_system.Close() if detected_operating_systems: logger.info('Preprocessing detected operating systems: {0:s}'.format( ', '.join(detected_operating_systems))) self.knowledge_base.SetValue( 'operating_system', detected_operating_systems[0])
Preprocesses the sources. Args: artifacts_registry_object (artifacts.ArtifactDefinitionsRegistry): artifact definitions registry. source_path_specs (list[dfvfs.PathSpec]): path specifications of the sources to process. resolver_context (Optional[dfvfs.Context]): resolver context.
juraj-google-style
def Refresh(self):
    with requests.Session() as session:
        session.proxies = self.proxy_config.proxies
        session.verify = not self.proxy_config.disable_certificate_validation
        session.cert = self.proxy_config.cafile
        self.creds.refresh(google.auth.transport.requests.Request(session=session))
Uses the Refresh Token to retrieve and set a new Access Token. Raises: google.auth.exceptions.RefreshError: If the refresh fails.
codesearchnet
def to_proto(self, export_scope=None):
    if export_scope is None or self.queue.name.startswith(export_scope):
        queue_runner_def = queue_runner_pb2.QueueRunnerDef()
        queue_runner_def.queue_name = ops.strip_name_scope(self.queue.name, export_scope)
        for enqueue_op in self.enqueue_ops:
            queue_runner_def.enqueue_op_name.append(ops.strip_name_scope(enqueue_op.name, export_scope))
        queue_runner_def.close_op_name = ops.strip_name_scope(self.close_op.name, export_scope)
        queue_runner_def.cancel_op_name = ops.strip_name_scope(self.cancel_op.name, export_scope)
        queue_runner_def.queue_closed_exception_types.extend([errors.error_code_from_exception_type(cls) for cls in self._queue_closed_exception_types])
        return queue_runner_def
    else:
        return None
Converts this `QueueRunner` to a `QueueRunnerDef` protocol buffer. Args: export_scope: Optional `string`. Name scope to remove. Returns: A `QueueRunnerDef` protocol buffer, or `None` if the `Variable` is not in the specified name scope.
github-repos
def send(self, request):
    if self.call is None:
        raise ValueError('Can not send() on an RPC that has never been open()ed.')
    if self.call.is_active():
        self._request_queue.put(request)
    else:
        next(self.call)
Queue a message to be sent on the stream. Send is non-blocking. If the underlying RPC has been closed, this will raise. Args: request (protobuf.Message): The request to send.
codesearchnet
def make_gradients(dims=DEFAULT_DIMS): return np.meshgrid(np.linspace(0.0, 1.0, dims[0]), np.linspace(0.0, 1.0, dims[1]))
Makes a pair of gradients to generate textures from numpy primitives. Args: dims (pair): the dimensions of the surface to create Returns: pair: A pair of surfaces.
codesearchnet
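Note that np.meshgrid returns arrays of shape (dims[1], dims[0]) by default; a quick check of the row above with small dimensions:

import numpy as np

gx, gy = np.meshgrid(np.linspace(0.0, 1.0, 4), np.linspace(0.0, 1.0, 3))
print(gx.shape, gy.shape)  # (3, 4) (3, 4)
print(gx[0])  # [0.         0.33333333 0.66666667 1.        ] -- horizontal ramp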
def zsh_complete(self, path, cmd, *cmds, sourceable=False):
    grouping = internal.zsh_version() >= (5, 4)
    path = pathlib.Path(path)
    firstline = ['#compdef', cmd]
    firstline.extend(cmds)
    subcmds = list(self.subcmds.keys())
    with path.open('w') as zcf:
        print(*firstline, end='\n\n', file=zcf)
        print('function _{} {{'.format(cmd), file=zcf)
        print('local line', file=zcf)
        print('_arguments -C', end=BLK, file=zcf)
        if subcmds:
            substrs = ["{}\\:'{}'".format(sub, self.subcmds[sub].help) for sub in subcmds]
            print('"1:Commands:(({}))"'.format(' '.join(substrs)), end=BLK, file=zcf)
        self._zsh_comp_command(zcf, None, grouping)
        if subcmds:
            print("'*::arg:->args'", file=zcf)
            print('case $line[1] in', file=zcf)
            for sub in subcmds:
                print('{sub}) _{cmd}_{sub} ;;'.format(sub=sub, cmd=cmd), file=zcf)
            print('esac', file=zcf)
        print('}', file=zcf)
        for sub in subcmds:
            print('\nfunction _{}_{} {{'.format(cmd, sub), file=zcf)
            print('_arguments', end=BLK, file=zcf)
            self._zsh_comp_command(zcf, sub, grouping)
            print('}', file=zcf)
        if sourceable:
            print('\ncompdef _{0} {0}'.format(cmd), *cmds, file=zcf)
Write zsh compdef script. Args: path (path-like): desired path of the compdef script. cmd (str): command name that should be completed. cmds (str): extra command names that should be completed. sourceable (bool): if True, the generated file will contain an explicit call to ``compdef``, which means it can be sourced to activate CLI completion.
codesearchnet
def GetEntries(self, parser_mediator, cache=None, database=None, **kwargs):
    if database is None:
        raise ValueError('Invalid database.')
    for table_name, callback_method in iter(self._tables.items()):
        if parser_mediator.abort:
            break
        if not callback_method:
            continue
        callback = getattr(self, callback_method, None)
        if callback is None:
            logger.warning('[{0:s}] missing callback method: {1:s} for table: {2:s}'.format(self.NAME, callback_method, table_name))
            continue
        esedb_table = database.get_table_by_name(table_name)
        if not esedb_table:
            logger.warning('[{0:s}] missing table: {1:s}'.format(self.NAME, table_name))
            continue
        callback(parser_mediator, cache=cache, database=database, table=esedb_table, **kwargs)
Extracts event objects from the database. Args: parser_mediator (ParserMediator): mediates interactions between parsers and other components, such as storage and dfvfs. cache (Optional[ESEDBCache]): cache. database (Optional[pyesedb.file]): ESE database. Raises: ValueError: If the database attribute is not valid.
codesearchnet
def actor_checkpoint_info(self, actor_id): self._check_connected() message = self._execute_command( actor_id, "RAY.TABLE_LOOKUP", ray.gcs_utils.TablePrefix.ACTOR_CHECKPOINT_ID, "", actor_id.binary(), ) if message is None: return None gcs_entry = ray.gcs_utils.GcsTableEntry.GetRootAsGcsTableEntry( message, 0) entry = ( ray.gcs_utils.ActorCheckpointIdData.GetRootAsActorCheckpointIdData( gcs_entry.Entries(0), 0)) checkpoint_ids_str = entry.CheckpointIds() num_checkpoints = len(checkpoint_ids_str) assert len(checkpoint_ids_str) % ID_SIZE == 0 checkpoint_ids = [ ray.ActorCheckpointID( checkpoint_ids_str[(i * ID_SIZE):((i + 1) * ID_SIZE)]) for i in range(num_checkpoints) ] return { "ActorID": ray.utils.binary_to_hex(entry.ActorId()), "CheckpointIds": checkpoint_ids, "Timestamps": [ entry.Timestamps(i) for i in range(num_checkpoints) ], }
Get checkpoint info for the given actor id. Args: actor_id: Actor's ID. Returns: A dictionary with information about the actor's checkpoint IDs and their timestamps.
juraj-google-style
def encode(self, spec, value, minimum_rank=0): raise NotImplementedError(f'{type(self).__name__}.encode')
Encodes `value` as a nest of batchable `Tensor` or `CompositeTensor`. Args: spec: The TypeSpec of the value to encode. value: A value compatible with `spec`. minimum_rank: The minimum rank for the returned Tensors, CompositeTensors, and ExtensionType values. This can be used to ensure that the encoded values can be unbatched this number of times. If `minimum_rank>0`, then `t.shape[:minimum_rank]` must be compatible for all values `t` returned by `encode`. Returns: A nest (as defined by `tf.nest`) of `tf.Tensor`s, batchable `tf.CompositeTensor`s, or `tf.ExtensionType`s. Stacking, unstacking, or concatenating these encoded values and then decoding the result must be equivalent to stacking, unstacking, or concatenating the original values.
github-repos
def batch_shuffle(index_array, batch_size):
    batch_count = int(len(index_array) / batch_size)
    last_batch = index_array[batch_count * batch_size:]
    index_array = index_array[:batch_count * batch_size]
    index_array = index_array.reshape((batch_count, batch_size))
    np.random.shuffle(index_array)
    index_array = index_array.flatten()
    return np.append(index_array, last_batch)
Shuffles an array in a batch-wise fashion. Useful for shuffling HDF5 arrays (where one cannot access arbitrary indices). Args: index_array: array of indices to be shuffled. batch_size: integer. Returns: The `index_array` array, shuffled in a batch-wise fashion.
github-repos
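A small demonstration of the batch-wise behavior above: whole blocks of batch_size move around, order within each block is preserved, and the leftover tail stays at the end.

import numpy as np

np.random.seed(0)
idx = np.arange(10)
batch_size = 3

batch_count = len(idx) // batch_size
tail = idx[batch_count * batch_size:]                           # [9]
blocks = idx[:batch_count * batch_size].reshape(batch_count, batch_size)
np.random.shuffle(blocks)  # shuffles the rows (batches), not elements within them
print(np.append(blocks.flatten(), tail))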
def GetPasswdMap(self, since=None): return PasswdUpdateGetter().GetUpdates(self, self.conf['passwd_url'], since)
Return the passwd map from this source. Args: since: Get data only changed since this timestamp (inclusive) or None for all data. Returns: instance of passwd.PasswdMap
github-repos
def __init__(self, browser, **kwargs):
    if len(kwargs) > 1:
        raise TypeError('BrowserQuery() takes at most 1 keyword argument.')
    if not kwargs:
        raise TypeError('Must pass a query keyword argument to BrowserQuery().')
    query_name, query_value = list(kwargs.items())[0]
    if query_name not in QUERY_TYPES:
        raise TypeError(u'{} is not a supported query type for BrowserQuery()'.format(query_name))

    def query_fn():
        return getattr(browser, QUERY_TYPES[query_name])(query_value)

    super(BrowserQuery, self).__init__(
        query_fn,
        desc=u"BrowserQuery({}={!r})".format(query_name, query_value),
    )
    self.browser = browser
Generate a query over a browser. Args: browser (selenium.webdriver): A Selenium-controlled browser. Keyword Args: css (str): A CSS selector. xpath (str): An XPath selector. Returns: BrowserQuery Raises: TypeError: The query must be passed either a CSS or XPath selector, but not both.
juraj-google-style
def __init__(self, tsk_attribute):
    super(TSKAttribute, self).__init__()
    self._tsk_attribute = tsk_attribute
Initializes an attribute. Args: tsk_attribute (pytsk3.Attribute): TSK attribute.
juraj-google-style
def tabledata_insert_all(self, table_name, rows):
    url = Api._ENDPOINT + (Api._TABLES_PATH % table_name) + "/insertAll"
    data = {
        'kind': 'bigquery#tableDataInsertAllRequest',
        'rows': rows
    }
    return datalab.utils.Http.request(url, data=data, credentials=self._credentials)
Issues a request to insert data into a table. Args: table_name: the name of the table as a tuple of components. rows: the data to populate the table, as a list of dictionaries. Returns: A parsed result object. Raises: Exception if there is an error performing the operation.
juraj-google-style
def __init__(self, channel): self.GetVersion = channel.unary_unary( '/versionpb.API/GetVersion', request_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString, response_deserializer=client_dot_version_dot_versionpb_dot_version__pb2.Version.FromString, )
Constructor. Args: channel: A grpc.Channel.
juraj-google-style
def _load(self, dataset='train'):
    data, labels = None, None
    if dataset == 'train':
        files = [os.path.join(self.cifar10_dir, 'data_batch_%d' % i) for i in range(1, 6)]
    else:
        files = [os.path.join(self.cifar10_dir, 'test_batch')]
    for file in files:
        if not os.path.exists(file):
            raise FileNotFoundError('Failed to find file: ' + file)
    for file in files:
        with open(file, 'rb') as f:
            cifar10 = pickle.load(f, encoding='latin1')
        if labels is None:
            labels = np.array(cifar10['labels'])
        else:
            labels = np.concatenate((labels, cifar10['labels']), axis=0)
        if data is None:
            data = cifar10['data']
        else:
            data = np.concatenate((data, cifar10['data']), axis=0)
    data = np.array(data, dtype=float) / 255.0
    data = data.reshape([-1, self.num_channels, self.img_size, self.img_size])
    data = data.transpose([0, 2, 3, 1])
    labels = np.eye(self.num_classes)[np.array(labels).reshape(-1)]
    if dataset == 'train':
        self._train_data, self._train_labels = data, labels
    else:
        self._test_data, self._test_labels = data, labels
Load the data in memory. Args: dataset: string in ['train', 'test']
juraj-google-style
def seek(self, relative_position):
    self._player_interface.Seek(Int64(1000.0 * 1000 * relative_position))
    self.seekEvent(self, relative_position)
Seek the video by `relative_position` seconds Args: relative_position (float): The position in seconds to seek to.
codesearchnet
def adduser(name, username, **kwargs):
    try:
        group_obj = _get_group_object(name)
    except pywintypes.com_error as exc:
        msg = 'Failed to access group {0}. {1}'.format(
            name, win32api.FormatMessage(exc.excepinfo[5]))
        log.error(msg)
        return False
    existing_members = [_get_username(x) for x in group_obj.members()]
    username = salt.utils.win_functions.get_sam_name(username)
    try:
        if username not in existing_members:
            group_obj.Add('WinNT://' + username.replace('\\', '/'))
            log.info('Added user %s', username)
        else:
            log.warning('User %s is already a member of %s', username, name)
            return False
    except pywintypes.com_error as exc:
        msg = 'Failed to add {0} to group {1}. {2}'.format(
            username, name, win32api.FormatMessage(exc.excepinfo[5]))
        log.error(msg)
        return False
    return True
Add a user to a group Args: name (str): The name of the group to modify username (str): The name of the user to add to the group Returns: bool: ``True`` if successful, otherwise ``False`` CLI Example: .. code-block:: bash salt '*' group.adduser foo username
juraj-google-style
def get_summary_string(self):
    from rez.plugin_managers import plugin_manager
    txt = 'Rez %s' % __version__
    txt += '\n\n%s' % plugin_manager.get_summary_string()
    return txt
Get a string summarising the state of Rez as a whole. Returns: String.
codesearchnet
def save(self, force=False):
    if (not self._success) and (not force):
        raise ConfigError('The config file appears to be corrupted:\n\n {fname}\n\nBefore attempting to save the configuration, please either fix the config file manually, or overwrite it with a blank configuration as follows:\n\n from dustmaps.config import config\n config.reset()\n\n'.format(fname=self.fname))
    with open(self.fname, 'w') as f:
        json.dump(self._options, f, indent=2)
Saves the configuration to a JSON, in the standard config location. Args: force (Optional[:obj:`bool`]): Continue writing, even if the original config file was not loaded properly. This is dangerous, because it could cause the previous configuration options to be lost. Defaults to :obj:`False`. Raises: :obj:`ConfigError`: if the configuration file was not successfully loaded on initialization of the class, and :obj:`force` is :obj:`False`.
codesearchnet
def getline(self, lnum=None): return (self._vim.current.buffer[lnum] if lnum else self._vim.current.line)
Get a line from the current buffer. Args: lnum (Optional[int]): Number of the line to get, current if ``None``. Todo: - Give this more behavior of Vim ``getline()``? - ``buffer[index]`` is zero-based, this is probably too confusing
codesearchnet
def forward(self, hidden_states):
    forwarded_states = self.mlp(hidden_states)
    output = hidden_states + self.norm(forwarded_states)
    return output
Args: hidden_states (`torch.Tensor`) : [num_groups, tokens_per_group, hidden_dim] inputs to send to experts. Returns: torch.Tensor[num_groups, tokens_per_group, hidden_dim]
github-repos
def to_str(value):
    if sys.version_info.major < 3 and isinstance(value, six.string_types):
        return value
    return str(value)
Convert the input to a string, unless it is a unicode string in Python 2. Unicode strings are supported as native strings in Python 3, but ``str()`` cannot be invoked on unicode strings in Python 2, so we need to check for that case when converting user-specified values to strings. Args: value: The value to convert to a string. Returns: str or unicode: The string representation of the value or the unicode string itself.
codesearchnet
def browse(self, folder, levels=None, prefix=None):
    assert isinstance(levels, int) or levels is None
    assert isinstance(prefix, string_types) or prefix is None
    return self.get('browse', params={'folder': folder, 'levels': levels, 'prefix': prefix})
Returns the directory tree of the global model. Directories are always JSON objects (map/dictionary), and files are always arrays of modification time and size. The first integer is the file's modification time, and the second integer is the file size. Args: folder (str): The root folder to traverse. levels (int): How deep to descend into the tree. (0 based, defaults to unlimited depth) prefix (str): Defines a prefix within the tree where to start building the structure. Returns: dict
codesearchnet
def stop_gradient(input, name=None):
    if isinstance(input, composite_tensor.CompositeTensor) and not _pywrap_utils.IsResourceVariable(input):
        return nest.map_structure(stop_gradient, input, expand_composites=True)
    with record.stop_recording():
        return gen_array_ops.stop_gradient(input, name=name)
Stops gradient computation. NOTE: This docstring is patched out below. See tensorflow/core/api_def/base_api/api_def_StopGradient.pbtxt for the full docstring. That file determines the public documentation page. Args: input: A `Tensor`. name: A name for this operation. Returns: A `Tensor`. Has the same dtype as `input`.
github-repos
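A usage sketch of the public API this row implements (standard tf.GradientTape example): the gradient flows only through the factor that is not wrapped.

import tensorflow as tf

x = tf.constant(3.0)
with tf.GradientTape() as tape:
    tape.watch(x)
    y = x * tf.stop_gradient(x)  # right factor is treated as a constant
print(tape.gradient(y, x))  # 3.0 rather than 6.0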
def add_gripper(self, arm_name, gripper):
    if arm_name in self.grippers:
        raise ValueError('Attempts to add multiple grippers to one body')
    arm_subtree = self.worldbody.find(".//body[@name='{}']".format(arm_name))
    for actuator in gripper.actuator:
        if actuator.get('name') is None:
            raise XMLError('Actuator has no name')
        if not actuator.get('name').startswith('gripper'):
            raise XMLError("Actuator name {} does not have prefix 'gripper'".format(actuator.get('name')))
    for body in gripper.worldbody:
        arm_subtree.append(body)
    self.merge(gripper, merge_body=False)
    self.grippers[arm_name] = gripper
Mounts gripper to arm. Throws error if robot already has a gripper or gripper type is incorrect. Args: arm_name (str): name of arm mount gripper (MujocoGripper instance): gripper MJCF model
codesearchnet
def AddDirectory(self, path):
    if self.file_system.FileEntryExistsByPath(path):
        raise ValueError('Path: {0:s} already set.'.format(path))
    self._AddParentDirectories(path)
    self.file_system.AddFileEntry(path, file_entry_type=definitions.FILE_ENTRY_TYPE_DIRECTORY)
Adds a directory to the fake file system. Note that this function will create parent directories if needed. Args: path (str): path of the directory within the fake file system. Raises: ValueError: if the path is already set.
codesearchnet
def read_from_hdx(identifier, configuration=None):
    if is_valid_uuid(identifier) is False:
        raise HDXError('%s is not a valid resource id!' % identifier)
    resource = Resource(configuration=configuration)
    result = resource._load_from_hdx('resource', identifier)
    if result:
        return resource
    return None
Reads the resource given by identifier from HDX and returns Resource object Args: identifier (str): Identifier of resource configuration (Optional[Configuration]): HDX configuration. Defaults to global configuration. Returns: Optional[Resource]: Resource object if successful read, None if not
juraj-google-style
def merge_dictionaries(dicts, merge_lists=False):
    dict1 = dicts[0]
    for other_dict in dicts[1:]:
        merge_two_dictionaries(dict1, other_dict, merge_lists=merge_lists)
    return dict1
Merges all dictionaries in dicts into a single dictionary and returns result Args: dicts (List[DictUpperBound]): Dictionaries to merge into the first one in the list merge_lists (bool): Whether to merge lists (True) or replace lists (False). Default is False. Returns: DictUpperBound: Merged dictionary
codesearchnet
def get_fail_graph(self, failure_index=None):
    phase, _ = self._get_failed_phase(failure_index)
    return phase.get_graph()
Returns a graph showing a solve failure. Args: failure_index: See `failure_reason` Returns: A pygraph.digraph object.
juraj-google-style
def install(self, apk_path, destination_dir=None, timeout_ms=None):
    if not destination_dir:
        destination_dir = '/data/local/tmp/'
    basename = os.path.basename(apk_path)
    destination_path = destination_dir + basename
    self.push(apk_path, destination_path, timeout_ms=timeout_ms)
    return self.Shell('pm install -r "%s"' % destination_path, timeout_ms=timeout_ms)
Install apk to device. Doesn't support verifier file, instead allows destination directory to be overridden. Arguments: apk_path: Local path to apk to install. destination_dir: Optional destination directory. Use /system/app/ for persistent applications. timeout_ms: Expected timeout for pushing and installing. Returns: The pm install output.
juraj-google-style
def delta_E(self):
    site_delta_E = self.final_site.energy - self.initial_site.energy
    if self.nearest_neighbour_energy:
        site_delta_E += self.nearest_neighbour_delta_E()
    if self.coordination_number_energy:
        site_delta_E += self.coordination_number_delta_E()
    return site_delta_E
The change in system energy if this jump were accepted. Args: None Returns: (Float): delta E
codesearchnet
def send(self, message_type, message, connection_id, one_way=False):
    try:
        self._network.send(message_type, message, connection_id, one_way=one_way)
    except ValueError:
        LOGGER.debug('Connection %s is no longer valid. Removing from list of peers.', connection_id)
        if connection_id in self._peers:
            del self._peers[connection_id]
Sends a message via the network. Args: message_type (str): The type of the message. message (bytes): The message to be sent. connection_id (str): The connection to send it to.
codesearchnet
def when_matches(self, path, good_value, bad_values=None, timeout=None, event_timeout=None):
    future = self.when_matches_async(path, good_value, bad_values)
    self.wait_all_futures(future, timeout=timeout, event_timeout=event_timeout)
Resolve when a path value equals value. Args: path (list): The path to wait to good_value (object): the value to wait for bad_values (list): values to raise an error on timeout (float): time in seconds to wait for responses, wait forever if None event_timeout: maximum time in seconds to wait between each response event, wait forever if None
juraj-google-style
def get(cls, session, record_id, endpoint_override=None):
    cls._check_implements('get')
    try:
        return cls(
            endpoint_override or '/%s/%d.json' % (cls.__endpoint__, record_id),
            singleton=True,
            session=session,
        )
    except HelpScoutRemoteException as e:
        if e.status_code == 404:
            return None
        else:
            raise
Return a specific record. Args: session (requests.sessions.Session): Authenticated session. record_id (int): The ID of the record to get. endpoint_override (str, optional): Override the default endpoint using this. Returns: helpscout.BaseModel: A record singleton, if existing. Otherwise ``None``.
juraj-google-style
def GetNetgroupMap(self, since=None): return NetgroupUpdateGetter().GetUpdates(self, self.conf['netgroup_url'], since)
Return the netgroup map from this source. Args: since: Get data only changed since this timestamp (inclusive) or None for all data. Returns: instance of netgroup.NetgroupMap
github-repos
def derive_value(self, value): return IonEvent(self.event_type, self.ion_type, value, self.field_name, self.annotations, self.depth)
Derives a new event from this one setting the ``value`` attribute. Args: value: (any): The value associated with the derived event. Returns: IonEvent: The newly generated non-thunk event.
codesearchnet
def delete_files(file_paths):
    if len(file_paths) == 0:
        raise RuntimeError('Clean up failed. Invalid file path: %s.' % file_paths)
    FileSystems.delete(file_paths)
A function to clean up files or directories using ``FileSystems``. Glob is supported in file path and directories will be deleted recursively. Args: file_paths: A list of strings contains file paths or directories.
github-repos
def register(self, address, retry=True):
    logger.debug("<%s> Sending REGISTER request to: %s" % (str(self.cuuid), str(address)))
    if not self.listener.listening:
        logger.warning("Neteria client is not listening.")
    message = {"method": "REGISTER", "cuuid": str(self.cuuid)}
    if self.encryption:
        message["encryption"] = [self.encryption.n, self.encryption.e]
    self.listener.send_datagram(
        serialize_data(message, self.compression, encryption=False), address)
    if retry:
        self.register_retries = 0
        self.listener.call_later(
            self.timeout, self.retransmit,
            {"method": "REGISTER", "address": address})
This function will send a register packet to the discovered Neteria server. Args: address (tuple): A tuple of the (address, port) to send the register request to. retry (boolean): Whether or not we want to reset the current number of registration retries to 0. Returns: None Examples: >>> address ('192.168.0.20', 40080)
juraj-google-style
def CompileReport(self, mediator):
    report_text = 'Tagging plugin produced {0:d} tags.\n'.format(
        self._number_of_event_tags)
    self._number_of_event_tags = 0
    return reports.AnalysisReport(plugin_name=self.NAME, text=report_text)
Compiles an analysis report. Args: mediator (AnalysisMediator): mediates interactions between analysis plugins and other components, such as storage and dfvfs. Returns: AnalysisReport: analysis report.
juraj-google-style
def __init__(self, callback):
    self._callback = callback
    self._brocade_tunnels = brocade_tunnels(callback=pynos.utilities.return_xml)
VCS init function Args: callback: Callback function that will be called for each action Returns: VCS Object Raises: None
juraj-google-style
def _getClassInstance(path, args=None):
    if not path.endswith(".py"):
        return None
    if args is None:
        args = {}
    classname = AtomShieldsScanner._getClassName(path)
    basename = os.path.basename(path).replace(".py", "")
    sys.path.append(os.path.dirname(path))
    try:
        mod = __import__(basename, globals(), locals(), [classname], 0)
        class_ = getattr(mod, classname)
        instance = class_(**args)
    except Exception as e:
        AtomShieldsScanner._debug("[!] %s" % e)
        return None
    finally:
        sys.path.remove(os.path.dirname(path))
    return instance
Returns a class instance from a .py file. Args: path (str): Absolute path to .py file args (dict): Arguments passed via class constructor Returns: object: Class instance or None
juraj-google-style
def _ParseRecordString(self, record_strings_data, record_strings_data_offset, string_offset): if (string_offset == 0): return None if (string_offset & self._STRING_OFFSET_MSB): if ((string_offset >> 60) != 8): raise errors.ParseError('Invalid inline record string flag.') string_size = ((string_offset >> 56) & 15) if (string_size >= 8): raise errors.ParseError('Invalid inline record string size.') string_data = bytes(bytearray([((string_offset >> (8 * byte_index)) & 255) for byte_index in range(6, (- 1), (- 1))])) try: return string_data[:string_size].decode('utf-8') except UnicodeDecodeError as exception: raise errors.ParseError('Unable to decode inline record string with error: {0!s}.'.format(exception)) data_offset = (string_offset - record_strings_data_offset) record_string_map = self._GetDataTypeMap('asl_record_string') try: record_string = self._ReadStructureFromByteStream(record_strings_data[data_offset:], string_offset, record_string_map) except (ValueError, errors.ParseError) as exception: raise errors.ParseError('Unable to parse record string at offset: 0x{0:08x} with error: {1!s}'.format(string_offset, exception)) return record_string.string.rstrip('\x00')
Parses a record string. Args: record_strings_data (bytes): record strings data. record_strings_data_offset (int): offset of the record strings data relative to the start of the file. string_offset (int): offset of the string relative to the start of the file. Returns: str: record string or None if string offset is 0. Raises: ParseError: if the record string cannot be parsed.
codesearchnet
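The inline-string branch above packs a short string directly into the 64-bit offset value; a self-contained sketch of the same bit layout (the packed value is constructed by hand for illustration):

value = (8 << 60) | (3 << 56) | (ord('c') << 48) | (ord('a') << 40) | (ord('t') << 32)

assert value >> 60 == 8          # inline-string flag nibble
size = (value >> 56) & 15        # string size nibble -> 3
data = bytes((value >> (8 * i)) & 255 for i in range(6, -1, -1))
print(data[:size])               # b'cat'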
def __init__(self, declaration): self._address_space = None self._type_qualifiers = [] self._basic_ctype = '' self._vector_type_length = None self._nmr_pointer_stars = 0 self._pointer_qualifiers = [] self._name = '' self._array_sizes = [] param = self class Semantics: def type_qualifiers(self, ast): if ast in param._type_qualifiers: raise ValueError('The pre-type qualifier "{}" is present multiple times.'.format(ast)) param._type_qualifiers.append(ast) return ast def address_space(self, ast): param._address_space = ''.join(ast) return ''.join(ast) def basic_ctype(self, ast): param._basic_ctype = ast return ast def vector_type_length(self, ast): param._vector_type_length = int(ast) return ast def pointer_star(self, ast): param._nmr_pointer_stars += 1 return ast def pointer_qualifiers(self, ast): if ast in param._pointer_qualifiers: raise ValueError('The pre-type qualifier "{}" is present multiple times.'.format(ast)) param._pointer_qualifiers.append(ast) return ast def name(self, ast): param._name = ast return ast def array_size(self, ast): param._array_sizes.append(int(ast[1:-1])) return ast _cl_data_type_parser.parse(declaration, semantics=Semantics())
Creates a new function parameter for the CL functions. Args: declaration (str): the declaration of this parameter. For example ``global int foo``.
juraj-google-style
def set_message(self, title, msg, typ, url=None): return self.user.send_notification(title=title, message=msg, typ=typ, url=url)
Sets user notification message. Args: title: Msg. title msg: Msg. text typ: Msg. type url: Additional URL (if exists) Returns: Message ID.
codesearchnet
def l2_loss(tensor, weight=1.0, scope=None):
    with tf.name_scope(scope, 'L2Loss', [tensor]):
        weight = tf.convert_to_tensor(weight, dtype=tensor.dtype.base_dtype, name='loss_weight')
        loss = tf.multiply(weight, tf.nn.l2_loss(tensor), name='value')
        tf.add_to_collection(LOSSES_COLLECTION, loss)
        return loss
Define a L2Loss, useful for regularize, i.e. weight decay. Args: tensor: tensor to regularize. weight: an optional weight to modulate the loss. scope: Optional scope for name_scope. Returns: the L2 loss op.
codesearchnet
def _update_cross_replica(self, update_fn, value, **kwargs):
    values_util.mark_as_unsaveable()
    return self.distribute_strategy.extended.update(self, update_fn, args=(value,), kwargs=kwargs, group=True)
Applies updates across replicas. Args: update_fn: A callable to pass to `strategy.extended.update` to update the variable. It should have the same signature as `Variable.assign()`. value: value to be passed to `update_fn`. **kwargs: remaining arguments to `update_fn`. Returns: Updated variable or `tf.Operation`.
github-repos
def get_name_scope(self) -> str: return self._name_stack
Returns the current name scope. For example: ```python with tf.name_scope('scope1'): with tf.name_scope('scope2'): print(tf.compat.v1.get_default_graph().get_name_scope()) ``` would print the string `scope1/scope2`. Returns: A string representing the current name scope.
github-repos
def exec_python(attr, src, executable='python'):
    import subprocess
    if isinstance(src, str):
        src = [src]
    p = popen([executable, '-c', '; '.join(src)], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    out, err = p.communicate()
    if p.returncode:
        from rez.exceptions import InvalidPackageError
        raise InvalidPackageError("Error determining package attribute '%s':\n%s" % (attr, err))
    return out.strip()
Runs a python subproc to calculate a package attribute. Args: attr (str): Name of package attribute being created. src (list of str): Python code to execute, will be converted into semicolon-delimited single line of code. Returns: str: Output of python process.
codesearchnet
def register(config_class, feature_extractor_class, exist_ok=False): FEATURE_EXTRACTOR_MAPPING.register(config_class, feature_extractor_class, exist_ok=exist_ok)
Register a new feature extractor for this class. Args: config_class ([`PretrainedConfig`]): The configuration corresponding to the model to register. feature_extractor_class ([`FeatureExtractorMixin`]): The feature extractor to register.
github-repos
def __div__(self, other): return self
DEPRECATED: Use `__floordiv__` via `x // y` instead. This function exists only for backwards compatibility purposes; new code should use `__floordiv__` via the syntax `x // y`. Using `x // y` communicates clearly that the result rounds down, and is forward compatible to Python 3. Args: other: Another `Dimension`. Returns: A `Dimension` whose value is the integer quotient of `self` and `other`.
github-repos
def get_completed_task(self, task, timeout=-1):
    self.__wait_task_completion(task, timeout)
    return self.get(task)
Waits until the task is completed and returns the task resource. Args: task: TaskResource timeout: Timeout in seconds Returns: dict: TaskResource
juraj-google-style
def subtract_business_days(self, date_tensor, num_days, roll_convention=constants.BusinessDayConvention.NONE): return self.add_business_days(date_tensor, -num_days, roll_convention)
Subtracts given number of business days from given dates. Note that this is different from calling `subtract_period_and_roll` with PeriodType.DAY. For example, subtracting 5 business days from Friday gives the previous Friday (unless there are holidays on this week or previous Friday). Subtracting 5 days and rolling means landing on Sunday and then rolling either to Monday or to Friday, depending on the roll convention. If any of the dates in `date_tensor` are not business days, they will be rolled to business days before doing the subtraction. If `roll_convention` is `NONE`, and any dates are not business days, an exception is raised. Args: date_tensor: `DateTensor` of dates to advance from. num_days: Tensor of int32 type broadcastable to `date_tensor`. roll_convention: BusinessDayConvention. Determines how to roll a date that falls on a holiday. Returns: The resulting `DateTensor`.
github-repos
def is_for_driver_task(self): return all(((len(x) == 0) for x in [self.module_name, self.class_name, self.function_name]))
See whether this function descriptor is for a driver or not. Returns: True if this function descriptor is for driver tasks.
codesearchnet
def guess_file_type(kind, filepath=None, youtube_id=None, web_url=None, encoding=None):
    if youtube_id:
        return FileTypes.YOUTUBE_VIDEO_FILE
    elif web_url:
        return FileTypes.WEB_VIDEO_FILE
    elif encoding:
        return FileTypes.BASE64_FILE
    else:
        ext = os.path.splitext(filepath)[1][1:].lower()
        if kind in FILE_TYPE_MAPPING and ext in FILE_TYPE_MAPPING[kind]:
            return FILE_TYPE_MAPPING[kind][ext]
    return None
guess_file_type: determines the file type of the content. Args: kind (str): content kind. filepath (str): filepath of file to check. youtube_id (str): YouTube video ID, if any. web_url (str): web video URL, if any. encoding (str): base64 encoding of the file, if any. Returns: FileTypes member indicating the file's type, or None if it cannot be determined.
juraj-google-style
def get_replicated_var_handle(self, name: Text, handle_id: Text, vars_: Union[List[core_types.Tensor], List[variables.Variable]], is_mirrored: bool=False, is_packed: bool=False) -> core_types.Tensor: device_assignment = _enclosing_tpu_device_assignment() handle = self._replicated_vars.get(handle_id) if handle is not None: return handle if device_assignment is not None and (not is_packed): job_name = pydev.DeviceSpec.from_string(vars_[0].device).job devices_to_vars = {device_util.canonicalize(v.device): v for v in vars_} replicated_vars = [] for replica_id in range(device_assignment.num_replicas): for logical_core in range(device_assignment.num_cores_per_replica): device = device_util.canonicalize(device_assignment.tpu_device(replica=replica_id, logical_core=logical_core, job=job_name)) if device in devices_to_vars: replicated_vars.append(devices_to_vars[device]) break else: raise ValueError('Failed to find a variable on any device in replica {} for current device assignment'.format(replica_id)) else: replicated_vars = vars_ _, graph = _enclosing_tpu_context_and_graph() with graph.as_default(): if isinstance(replicated_vars[0], variables.Variable): replicated_vars = [v.handle for v in replicated_vars] saved_context = graph._get_control_flow_context() graph._set_control_flow_context(self.outer_context) handle = tpu_ops.tpu_replicated_input(replicated_vars, name=name + '/handle', is_mirrored_variable=is_mirrored, is_packed=is_packed) graph._set_control_flow_context(saved_context) self._replicated_vars[handle_id] = handle return handle
Returns a variable handle for replicated TPU variable 'var'. This is a method used by an experimental replicated variable implementation and is not intended as a public API. Args: name: The common name of the variable. handle_id: Unique ID of the variable handle, used as the cache key. vars_: The replicated TPU variables or handles. is_mirrored: Whether the variables are mirrored, which guarantees the values in each replica are always the same. is_packed: Whether the replicated variables are packed into one variable. Returns: The handle of the TPU replicated input node.
github-repos
def _build(self, *args, **kwargs): flattened = nest.flatten([args, kwargs]) merged_flattened = [(merge_leading_dims(inp, self._n_dims) if (inp is not None) else None) for inp in flattened] (merged_args, merged_kwargs) = nest.pack_sequence_as([args, kwargs], merged_flattened) results = self._module(*merged_args, **merged_kwargs) example_input = tf.convert_to_tensor(flattened[self._input_example_index]) def _split_to_original_leading_dims(result): if (result is None): return None else: return split_leading_dim(result, example_input, self._n_dims) flat_results = nest.flatten(results) flat_unmerged_results = [_split_to_original_leading_dims(result) for result in flat_results] return nest.pack_sequence_as(results, flat_unmerged_results)
Connects the BatchApply module into the graph. Args: *args: a Tensor or a nested list or dictionary of Tensors. The input tensors will have their first dimensions merged, then an op or a module will be called on the input. The first dimension of the output tensor(s) will be split again based on the leading dimensions of the first input tensor. **kwargs: Dictionary of named arguments; used in the same way as `*args`. Returns: A Tensor or nested list or dictionary of Tensors as a result of applying the process above. ("None" return values are also supported.)
codesearchnet
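A minimal sketch of the merge-apply-split pattern above, assuming the Sonnet v1 BatchApply wrapper that exposes this `_build`; module choice and shapes are illustrative:

import tensorflow as tf  # TF1-era API, matching the snippet
import sonnet as snt

inputs = tf.zeros([8, 16, 32])  # [batch, time, features]
outputs = snt.BatchApply(snt.Linear(output_size=4))(inputs)
# the leading [8, 16] dims are merged to [128, 32], the Linear is applied,
# and the result is split back to [8, 16, 4]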
def __init__(self, *, dtype: Type[np.number] = np.complex64, noise: devices.NoiseModel = devices.NO_NOISE): if dtype not in {np.complex64, np.complex128}: raise ValueError( 'dtype must be complex64 or complex128, was {}'.format(dtype)) self._dtype = dtype self.noise = noise
Density matrix simulator. Args: dtype: The `numpy.dtype` used by the simulation. One of `numpy.complex64` or `numpy.complex128`. noise: A noise model to apply while simulating.
juraj-google-style
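A hedged Cirq usage sketch; wrapping the depolarizing channel in ConstantQubitNoiseModel is an assumption for satisfying the `devices.NoiseModel` argument:

import numpy as np
import cirq

q = cirq.LineQubit(0)
circuit = cirq.Circuit([cirq.X(q), cirq.measure(q)])
sim = cirq.DensityMatrixSimulator(
    dtype=np.complex128,
    noise=cirq.ConstantQubitNoiseModel(cirq.depolarize(0.01)))
result = sim.simulate(circuit)  # result.final_density_matrix holds the mixed state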
def get_ir_reciprocal_mesh(self, mesh=(10, 10, 10), is_shift=(0, 0, 0)): shift = np.array([(1 if i else 0) for i in is_shift]) (mapping, grid) = spglib.get_ir_reciprocal_mesh(np.array(mesh), self._cell, is_shift=shift, symprec=self._symprec) results = [] for (i, count) in zip(*np.unique(mapping, return_counts=True)): results.append((((grid[i] + (shift * (0.5, 0.5, 0.5))) / mesh), count)) return results
k-point mesh of the Brillouin zone generated taking symmetry into account. The method returns the irreducible kpoints of the mesh and their weights. Args: mesh (3x1 array): The number of kpoints for the mesh in each direction. is_shift (3x1 array): Whether to shift the kpoint grid. (1, 1, 1) means all points are shifted by 0.5, 0.5, 0.5. Returns: A list of irreducible kpoints and their weights as a list of tuples [(ir_kpoint, weight)], with ir_kpoint given in fractional coordinates
codesearchnet
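A usage sketch via pymatgen's SpacegroupAnalyzer, which appears to own this method; the toy structure is illustrative:

from pymatgen.core import Lattice, Structure
from pymatgen.symmetry.analyzer import SpacegroupAnalyzer

structure = Structure(Lattice.cubic(3.36), ['Po'], [[0, 0, 0]])  # simple cubic Po
sga = SpacegroupAnalyzer(structure)
for kpoint, weight in sga.get_ir_reciprocal_mesh(mesh=(4, 4, 4)):
    print(kpoint, weight)  # fractional coordinates and multiplicity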
def __init__(self, name=None, options=None): del options rr = gen_io_ops.lmdb_reader(name=name) super(LMDBReader, self).__init__(rr)
Create a LMDBReader. Args: name: A name for the operation (optional). options: A LMDBRecordOptions object (optional).
github-repos
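A hedged TF1-style usage sketch; queue-runner readers are deprecated and this reader was dropped from newer TF releases, so the names below are assumptions against the tf.compat.v1 API:

import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

reader = tf.LMDBReader()
queue = tf.train.string_input_producer(['data.mdb'], num_epochs=1)
key, value = reader.read(queue)  # one (key, value) record per evaluation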
def dump(self, content, entry_type): new_content = copy.deepcopy(content) new_content['Type'] = entry_type.value with self._lock: with io.open(self._path, 'a', encoding='utf-8') as f: yaml.safe_dump(new_content, f, explicit_start=True, explicit_end=True, allow_unicode=True, indent=4)
Dumps a dictionary as a yaml document to the summary file. Each call to this method dumps a separate yaml document to the same summary file associated with a test run. The content of the dumped dictionary has an extra field `Type` that specifies the type of each yaml document, which is the flag for parsers to identify each document. Args: content: dictionary, the content to serialize and write. entry_type: a member of enum TestSummaryEntryType. Raises: records.Error: An invalid entry type is passed in.
github-repos
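A hedged usage sketch against Mobly-style records; the writer class and enum member names are assumptions consistent with the docstring:

from mobly import records  # assumed module, matching the docstring's enum

writer = records.TestSummaryWriter('/tmp/test_summary.yaml')
writer.dump({'job_id': 'build-42'}, records.TestSummaryEntryType.USER_DATA)
# appends one yaml document delimited by '---' / '...' with an extra 'Type' field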
def _head(self, client_kwargs): return _handle_http_errors( self.client.request( 'HEAD', timeout=self._TIMEOUT, **client_kwargs)).headers
Returns object HTTP header. Args: client_kwargs (dict): Client arguments. Returns: dict: HTTP header.
juraj-google-style
def same_dynamic_shape(a, b): a = tf.convert_to_tensor(value=a, name='a') b = tf.convert_to_tensor(value=b, name='b') def all_shapes_equal(): return tf.reduce_all(input_tensor=tf.equal(tf.concat([tf.shape(input=a), tf.shape(input=b)], 0), tf.concat([tf.shape(input=b), tf.shape(input=a)], 0))) return tf.cond(pred=tf.equal(tf.rank(a), tf.rank(b)), true_fn=all_shapes_equal, false_fn=(lambda : tf.constant(False)))
Returns whether a and b have the same dynamic shape. Args: a: `Tensor` b: `Tensor` Returns: `bool` `Tensor` representing if both tensors have the same shape.
codesearchnet
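A short sketch of the two branches: equal ranks trigger the element-wise shape check, differing ranks short-circuit to False:

import tensorflow as tf

a = tf.ones([2, 3])
b = tf.zeros([2, 3])
c = tf.zeros([2, 3, 1])
same_dynamic_shape(a, b)  # tf.Tensor(True)  - ranks match, shapes match
same_dynamic_shape(a, c)  # tf.Tensor(False) - ranks differ, all_shapes_equal never runs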
def u2handlers(self): return_handlers = suds.transport.http.HttpTransport.u2handlers(self) return_handlers.extend(self.handlers) return return_handlers
Get a collection of urllib2 handlers to be installed in the opener. Returns: A list of handlers to be installed to the OpenerDirector used by suds.
codesearchnet
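A hedged sketch of where `self.handlers` would come from: a custom suds transport subclass that injects an extra urllib2 handler (Python 2 era, matching suds):

import urllib2
import suds.transport.http

class ProxyTransport(suds.transport.http.HttpTransport):
    def __init__(self, *args, **kwargs):
        suds.transport.http.HttpTransport.__init__(self, *args, **kwargs)
        # extra handlers picked up by u2handlers() and installed in the opener
        self.handlers = [urllib2.ProxyHandler({'http': 'http://proxy.example:3128'})]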
def parse(self, argument): if isinstance(argument, list): return argument elif not argument: return [] else: if self._comma_compat: argument = argument.replace(',', ' ') return argument.split()
Parses argument as whitespace-separated list of strings. It also parses argument as comma-separated list of strings if requested. Args: argument: string argument passed in the commandline. Returns: [str], the parsed flag value.
juraj-google-style
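The parsing semantics, shown as a hedged sketch; the parser class name and its private absl location are assumptions modeled on absl-style list flags:

from absl.flags import _argument_parser  # private module; assumed location

parser = _argument_parser.WhitespaceSeparatedListParser(comma_compat=True)
parser.parse('a,b c')     # ['a', 'b', 'c'] - commas treated as whitespace
parser.parse(['x', 'y'])  # ['x', 'y']      - lists pass through unchanged
parser.parse('')          # []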
def add_spectra(self, spectra_dict, key_sort_func=None): if key_sort_func: keys = sorted(spectra_dict.keys(), key=key_sort_func) else: keys = spectra_dict.keys() for label in keys: self.add_spectrum(label, spectra_dict[label])
Add a dictionary of spectra, with an optional sorting function for the keys. Args: spectra_dict: dict of {label: Spectrum} key_sort_func: function used to sort the spectra_dict keys.
juraj-google-style
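A usage sketch assuming pymatgen's SpectrumPlotter, where this method appears to live; the spectra themselves are illustrative:

import numpy as np
from pymatgen.core.spectrum import Spectrum
from pymatgen.vis.plotters import SpectrumPlotter

x = np.linspace(0, 10, 50)
plotter = SpectrumPlotter()
plotter.add_spectra(
    {'sample-B': Spectrum(x, np.cos(x) ** 2), 'sample-A': Spectrum(x, np.sin(x) ** 2)},
    key_sort_func=str.lower)  # keys sorted case-insensitively before plotting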
def __init__(
        self,
        app=None,
        base_url='http://localhost:8080',  # default reconstructed from the docstring; source string was truncated
        namespaces=DEFAULT_NAMESPACES):
    self.app = app
    self.namespaces = namespaces
    self.base_url = None
    if app is not None:
        self.init_app(app)
        if 'FEDORA_BASE_URL' in app.config:
            self.base_url = app.config.get('FEDORA_BASE_URL')
    if self.base_url is None:
        self.base_url = base_url
    if self.base_url.endswith("/"):
        self.base_url = self.base_url[:-1]
    self.transaction = []
Initializes a Repository object Args: app(Flask): Flask app, default is None base_url(str): Base url for Fedora Commons, defaults to localhost:8080. namespaces(list): List of namespace tuples of prefix, uri for each namespace in Fedora
juraj-google-style
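A hedged sketch of the two initialization paths implied by the signature (Flask config lookup vs. explicit base_url); the config URL is hypothetical:

from flask import Flask

app = Flask(__name__)
app.config['FEDORA_BASE_URL'] = 'http://localhost:8080/fedora/rest'  # hypothetical value
repo = Repository(app=app)  # base_url read from app.config

standalone = Repository(base_url='http://localhost:8080')  # no Flask app needed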
def cancel_job_button(self, description=None): if (not hasattr(self, 'jobId')): return try: import ipywidgets as widgets if (not description): description = 'Cancel job: ' description += (self.name if hasattr(self, 'name') else self.job.name) button = widgets.Button(description=description, button_style='danger', layout=widgets.Layout(width='40%')) out = widgets.Output() vb = widgets.VBox([button, out]) @out.capture(clear_output=True) def _cancel_job_click(b): b.disabled = True print((('Cancelling job: id=' + str(self.job.id)) + ' ...\n'), flush=True) try: rc = self.job.cancel() out.clear_output() if rc: print((((('Cancelled job: id=' + str(self.job.id)) + ' : ') + self.job.name) + '\n'), flush=True) else: print((((('Job already cancelled: id=' + str(self.job.id)) + ' : ') + self.job.name) + '\n'), flush=True) except: b.disabled = False out.clear_output() raise button.on_click(_cancel_job_click) display(vb) except: pass
Display a button that will cancel the submitted job. Used in a Jupyter IPython notebook to provide an interactive mechanism to cancel a job submitted from the notebook. Once clicked the button is disabled unless the cancel fails. A job may be cancelled directly using:: submission_result = submit(ctx_type, topology, config) submission_result.job.cancel() Args: description(str): Text used as the button description, defaults to value based upon the job name. .. warning:: Behavior when called outside a notebook is undefined. .. versionadded:: 1.12
codesearchnet
def floodlight_report(config, task: dict, floodlight_id: int) -> int:
    account_id, subaccount_id = parse_account(config, task['auth'], task['account'])
    name = 'Floodlight Monitor %s %s ( StarThinker )' % (account_id, floodlight_id)
    if config.verbose:
        print('FLOODLIGHT MONITOR REPORT: ', name)
    # report body truncated in the source; kind and name are a minimal
    # reconstruction, the full report criteria are not recoverable here
    report = report_build(config, task['auth'], task['account'],
                          {'kind': 'dfareporting#report', 'name': name})
    return report['id']
Create a report for a specific floodlight if it does not exist. Args: config - StarThinker configuration object. task - recipe task dictionary with auth and account fields. floodlight_id - the floodlight being monitored. Returns: The id of the created report.
github-repos
def get_tensors(object_): if torch.is_tensor(object_): return [object_] elif isinstance(object_, (str, float, int)): return [] tensors = set() if isinstance(object_, collections.abc.Mapping): for value in object_.values(): tensors.update(get_tensors(value)) elif isinstance(object_, collections.abc.Iterable): for value in object_: tensors.update(get_tensors(value)) else: members = [value for (key, value) in inspect.getmembers(object_) if (not isinstance(value, (collections.abc.Callable, type(None))))] tensors.update(get_tensors(members)) return tensors
Get all tensors associated with ``object_``. Args: object_ (any): Any object to look for tensors in. Returns: (list or set of torch.Tensor): Tensors that are associated with ``object_``; a bare tensor comes back as a one-element list, containers yield a set.
codesearchnet
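A short usage sketch; note the top-level return type is a list for a bare tensor but a set for containers:

import torch

batch = {
    'input': torch.zeros(4, 8),
    'meta': {'mask': torch.ones(4)},
    'label': 'cat',  # strings are skipped
}
found = get_tensors(batch)         # set containing the two tensors
only = get_tensors(torch.ones(2))  # [tensor] - bare tensors come back in a list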