Finding the roots of a 4th-degree polynomial with Halley's method in TensorFlow

Question (votes: 3, answers: 2)

I have just started learning TensorFlow, and to begin with I am using it for various numerical computations rather than jumping straight into its ML applications. I am doing this on Google Cloud Platform, and I ran into this problem and got stuck.

Roots of 4th degree polynomial

I am using lazy evaluation: with placeholders I can create instances of a0, a1, ..., a4 in the TensorFlow graph, and I can also write out the function. But how do I make the initial guess with TensorFlow? And even once I have a value x0, how do I apply the iteration with tf.while_loop? I went through its documentation and this post, but I still cannot figure out how to proceed. I tried to find a post with a similar question or content, but could not find one that uses TensorFlow. It would be great to get some insight, or pointers to built-in TensorFlow functions and commands. Thanks in advance!
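For reference, given f(x) = a0 + a1·x + a2·x² + a3·x³ + a4·x⁴, Halley's method iterates:

```latex
x_{n+1} = x_n - \frac{2\, f(x_n)\, f'(x_n)}{2\, f'(x_n)^2 - f(x_n)\, f''(x_n)}
```

Both answers below implement exactly this update; they differ only in how the derivatives are obtained.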

python tensorflow google-cloud-platform polynomials
2 Answers

2 votes

When I run the first example from here, I get these values. Note that the equation is different.

1.4999999969612645

1.411188880378198

1.4142132016669995

1.4142135623730898

But it seems like a good example.

import tensorflow as tf

h = tf.constant(.00000001, dtype='float64')  # step size for numerical derivatives
eps = tf.constant(.000001, dtype='float64')  # convergence tolerance
b = tf.constant(2.0, tf.float64)             # the constant 2 in Halley's update; also the starting point

def f(x):
    # f(x) = x^2 - 2, whose positive root is sqrt(2)
    return tf.subtract(tf.multiply(x, x), 2.)

def fp(x):
    # forward-difference approximation of f'(x)
    return tf.divide(tf.subtract(f(tf.add(x, h)), f(x)), h)

def fpp(x):
    # forward-difference approximation of f''(x)
    return tf.divide(tf.subtract(fp(tf.add(x, h)), fp(x)), h)

def cond(i, x_new, x_prev):
    # Continue while fewer than 5 steps have run and successive estimates
    # agree to within eps.  (body returns x_prev equal to the new estimate,
    # so the difference is always zero and i < 5 is what stops the loop.)
    return tf.logical_and(i < 5,
                          tf.less_equal(tf.abs(tf.subtract(x_new, x_prev)),
                                        eps))

def body(i, x_new, x_prev):
    # One Halley step: x <- x - 2*f*f' / (2*f'^2 - f*f'')
    fx = f(x_prev)
    fpx = fp(x_prev)
    x_new = tf.subtract(x_prev,
                        tf.divide(b * fx * fpx,
                                  tf.subtract(b * fpx * fpx,
                                              fx * fpp(x_prev))))

    # tf.Print logs the current estimate as a side effect of each iteration
    xnew = tf.Print(x_new, [x_new], message="The root is : ")

    with tf.control_dependencies([x_new, xnew]):
        x_prev = tf.identity(xnew)

    return [i + 1, xnew, x_prev]

sess = tf.Session()
sess.run(tf.global_variables_initializer())


print( sess.run(tf.while_loop(cond, body, [1, b - fpp(b), b])) )

The root is : [1.4999999969612645]

The root is : [1.411188880378198]

The root is : [1.4142132016669995]

The root is : [1.4142135623730898]

[5, 1.4142135623730898, 1.4142135623730898]
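The printed sequence matches what a plain Halley iteration gives for f(x) = x² − 2 when analytic derivatives are used. A minimal pure-Python sanity check (the helper name `halley_sqrt2` is mine, not from the answer):

```python
def halley_sqrt2(x, steps=5):
    """Halley iteration for f(x) = x**2 - 2 using exact derivatives."""
    for _ in range(steps):
        f = x * x - 2.0   # f(x)
        fp = 2.0 * x      # f'(x)
        fpp = 2.0         # f''(x)
        x = x - (2.0 * f * fp) / (2.0 * fp * fp - f * fpp)
    return x

print(halley_sqrt2(2.0))  # converges to sqrt(2) ~ 1.41421356...
```

With exact derivatives the iteration converges cubically, so a handful of steps already reaches machine precision; the finite-difference version above converges to the same root but with noisier intermediate values.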


0 votes

Here is my implementation with eager evaluation, using TensorFlow's GradientTape to compute the derivatives:

import tensorflow as tf
print("Tensorflow-CPU version is {0}".format(tf.__version__))

stop_variation = 0.00001 # Variation threshold from previous iteration to stop iteration

def halley(i, coeffs, x_new, x_prev):
    """
    Halley's Method implementation
    """

    a0 = coeffs[0]
    a1 = coeffs[1]
    a2 = coeffs[2]
    a3 = coeffs[3]
    a4 = coeffs[4]

    with tf.GradientTape() as g:
        g.watch(x_new)
        with tf.GradientTape() as gg:
            gg.watch(x_new)
            f = a0 + a1 * x_new + a2 * x_new**2 + a3 * x_new**3 + a4 * x_new**4
        df_dx = gg.gradient(f, x_new)   # first derivative
    df_dx2 = g.gradient(df_dx, x_new)   # second derivative

    numerator = 2 * f * df_dx
    denominator = 2 * df_dx*df_dx - f*df_dx2
    new_x  = x_new - (numerator/denominator)
    prev_x = x_new
    print("Root approximation in step {0} = {1}".format(i, new_x))
    return [i+1, coeffs, new_x, prev_x]

def condition(i, a, x_new, x_prev):
    variation = tf.abs(x_new - x_prev)
    return tf.less(stop_variation, variation)

# Eager mode must be enabled before any tensors are created (TF 1.x).
tf.enable_eager_execution()

a = tf.constant([2.0, -4.0, 1.0, 2.0, 0.0])  # coefficients a0..a4
x = tf.constant(40.0)        # initial guess
xprev = tf.constant(100.0)   # previous estimate (seeded so the loop starts)

roots = tf.while_loop(
    condition,
    halley,
    loop_vars=[1, a, x, xprev],
    maximum_iterations=1000)

print("Result after {0} iterations is {1}.".format(roots[0]-1, roots[2]))
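As a cross-check without GradientTape, the same update can be written with analytic polynomial derivatives. This is only a sketch: the helper name `halley_poly` and the start value -2.0 (chosen near the polynomial's sign change, unlike the answer's start at 40.0) are my own:

```python
def halley_poly(coeffs, x, tol=1e-10, max_iter=100):
    """Halley's method for a quartic with coefficients [a0, a1, a2, a3, a4]."""
    a0, a1, a2, a3, a4 = coeffs
    for _ in range(max_iter):
        f = a0 + a1 * x + a2 * x**2 + a3 * x**3 + a4 * x**4
        df = a1 + 2 * a2 * x + 3 * a3 * x**2 + 4 * a4 * x**3
        d2f = 2 * a2 + 6 * a3 * x + 12 * a4 * x**2
        x_new = x - (2 * f * df) / (2 * df * df - f * d2f)
        if abs(x_new - x) < tol:  # stop when successive estimates agree
            return x_new
        x = x_new
    return x

root = halley_poly([2.0, -4.0, 1.0, 2.0, 0.0], -2.0)
print(root)  # the polynomial's only real root, near -1.862
```

For these coefficients the quartic term is zero and the cubic 2x³ + x² − 4x + 2 has a single real root, so starting close to it converges in a few steps; far-away starting points like 40.0 can wander before settling, which is why the answer caps the loop at 1000 iterations.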