Sus20200

Reputation: 337

Implementing clipping function using Keras backend

I would like to implement a clipping function using Keras backend:

f(x) = x, if 0 < x < 1
f(x) = 1, if x >= 1
f(x) = 0, otherwise

I can do it in numpy as follows:

def myclip(x):
    import numpy as np
    x = np.asarray(x)
    # 1 where x >= 1, plus x where 0 < x < 1; 0 everywhere else
    return (x >= 1).astype(np.int64) + np.logical_and(x > 0, x < 1) * x
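(As an aside, NumPy's built-in clip gives the same result in one call; a minimal equivalent:)

def myclip_short(x):
    import numpy as np
    # np.clip saturates values to [0, 1], which is exactly f(x)
    return np.clip(x, 0.0, 1.0)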

Translated to the Keras backend, it could be something like:

def myclipK(x):
    from tensorflow.keras import backend as K
    return K.int64(x >= 1) + K.multiply(K.int64(K.logical_and(x > 0, x < 1)), x)

But the backend doesn't have functions like K.int64, K.multiply, or K.logical_and.

How can I do this?

Upvotes: 3

Views: 514

Answers (2)

karandeep36

Reputation: 336

You can make use of the keras.backend functionality.

It has functions like clip(), sum(), greater(), etc.

You will have to express your equation using keras.backend operations, and it should work; see the sketch below.
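For instance, here is a minimal sketch assuming the TensorFlow backend (the function names are just illustrative). K.clip alone already implements the piecewise definition, and the second variant mirrors the mask-based NumPy version using only existing backend functions:

from tensorflow.keras import backend as K

def myclip_keras(x):
    # Saturating to [0, 1] is exactly f(x): values <= 0 become 0,
    # values >= 1 become 1, and everything in between passes through
    return K.clip(x, 0.0, 1.0)

def myclip_keras_masks(x):
    # Literal translation of the NumPy version: build 0/1 masks with
    # comparison ops; there is no K.logical_and, but multiplying the
    # two masks computes the same AND
    ge_one = K.cast(K.greater_equal(x, 1.0), K.floatx())
    in_between = K.cast(K.greater(x, 0.0), K.floatx()) * K.cast(K.less(x, 1.0), K.floatx())
    return ge_one + in_between * x

Either function can then be applied inside your model, e.g. via a Lambda layer.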

Upvotes: 1

eugen

Reputation: 1329

Looking at the Keras documentation, there is no direct equivalent of those functions. Below are all the functions the backend currently exposes:

_broadcast_normalize_batch_in_training
_fused_normalize_batch_in_training
_get_available_gpus
_get_current_tf_device
_GRAPH_LEARNING_PHASES
_GRAPH_UID_DICTS
_has_nchw_support
_is_current_explicit_device
_LOCAL_DEVICES
_MANUAL_VAR_INIT
_preprocess_conv1d_input
_preprocess_conv2d_input
_preprocess_conv3d_input
_preprocess_padding
_regular_normalize_batch_in_training
_SESSION
_TfDeviceCaptureOp
_to_tensor
abs
all
any
arange
argmax
argmin
batch_dot
batch_flatten
batch_get_value
batch_normalization
batch_set_value
bias_add
binary_crossentropy
cast
categorical_crossentropy
clear_session
clip
concatenate
constant
conv1d
conv2d
conv2d_transpose
conv3d
conv3d_transpose
cos
count_params
ctc_batch_cost
ctc_decode
ctc_label_dense_to_sparse
cumprod
cumsum
depthwise_conv2d
dot
dropout
dtype
elu
equal
eval
exp
expand_dims
eye
flatten
foldl
foldr
function
Function
gather
get_session
get_uid
get_value
get_variable_shape
gradients
greater
greater_equal
hard_sigmoid
identity
in_test_phase
in_top_k
in_train_phase
int_shape
is_keras_tensor
is_placeholder
is_sparse
is_tensor
l2_normalize
learning_phase
less
less_equal
local_conv1d
local_conv2d
log
logsumexp
manual_variable_initialization
map_fn
max
maximum
mean
min
minimum
moving_average_update
name_scope
ndim
normalize_batch_in_training
not_equal
one_hot
ones
ones_like
permute_dimensions
placeholder
pool2d
pool3d
pow
print_tensor
prod
py_all
py_any
py_slice
py_sum
random_binomial
random_normal
random_normal_variable
random_uniform
random_uniform_variable
relu
repeat
repeat_elements
reset_uids
reshape
resize_images
resize_volumes
reverse
rnn
round
separable_conv1d
separable_conv2d
set_learning_phase
set_session
set_value
shape
sigmoid
sign
sin
slice
softmax
softplus
softsign
sparse_categorical_crossentropy
spatial_2d_padding
spatial_3d_padding
sqrt
square
squeeze
stack
std
stop_gradient
sum
switch
tanh
temporal_padding
tile
to_dense
transpose
truncated_normal
update
update_add
update_sub
var
variable
zeros
zeros_like

Therefore, you would have to build your own implementation from these primitives, or contribute the missing functions to Keras on GitHub. Sad but true.
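Incidentally, a list like the one above can be regenerated at any time with dir(); a minimal sketch, assuming the TensorFlow-bundled Keras:

from tensorflow.keras import backend as K

# Print every name the backend module exposes, public and private alike
for name in sorted(dir(K)):
    print(name)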

Upvotes: 1
