Tuesday, 12 August 2025

Solution to VBA ThisWorkbook.Path and ActiveWorkbook.Path commands not working properly in Excel on OneDrive (as of December 2024)

Introduction

This is a well-known issue among VBA users:
When a workbook is synchronized with OneDrive, which is widely used by Microsoft users, Excel VBA's ThisWorkbook.Path and ActiveWorkbook.Path do not work as expected.

I found a good solution, so I will share it here.

Problem

Let me summarize the issue.

  • On OneDrive, Excel VBA's ThisWorkbook.Path and ActiveWorkbook.Path return an 'https://...' cloud URL instead of a local path.
  • If this URL is used for file access in VBA, the operation fails with an error.
  • The cloud URL differs depending on the subscription type.
    • OneDrive for Business: https://company-my.sharepoint.com/personal/username_domain_com/Documents/filepath
    • OneDrive Personal: https://d.docs.live.net/CID_number/filepath

It's a bit troublesome.
Is Microsoft abandoning VBA and moving to Office Scripts or Python...?

Solution

Many websites have proposed solutions to this problem.

The best method is to use the following code.
The standard module file (funcOneDrive.bas) can be downloaded here, and the Excel file is available here.

Public Function OneDrivePath(S_Url As String) As String

    Const Cns_OneDriveCommercialPattern As String = "my.sharepoint.com"
    ' Right-hand value for Like operator to determine if the URL is for OneDrive for Business

    Dim S_pathSeparator As String
    Dim S_OneDriveCommercialPath As String
    Dim S_OneDriveConsumerPath As String

    Dim S_PathPosition As Long

    'If the argument is not a URL, assume it is a local path and return it as is.
    If Not (S_Url Like "https://*") Then
        OneDrivePath = S_Url
        Exit Function
    End If

    S_pathSeparator = Application.PathSeparator

    'Path for OneDrive for Business (Commercial)
    S_OneDriveCommercialPath = Environ("OneDriveCommercial")
    If (S_OneDriveCommercialPath = "") Then
        S_OneDriveCommercialPath = Environ("OneDrive")
    End If

    ' Path for personal OneDrive
    S_OneDriveConsumerPath = Environ("OneDriveConsumer")
    If (S_OneDriveConsumerPath = "") Then
        S_OneDriveConsumerPath = Environ("OneDrive")
    End If

    ' For business OneDrive: S_Url = "https://[company]-my.sharepoint.com/personal/[username]_domain_com/Documents/[filepath]"
    If (S_Url Like "*" & Cns_OneDriveCommercialPattern & "*") Then

        S_PathPosition = InStr(1, S_Url, "/Documents") + 10                         '10 = Len("/Documents")
        OneDrivePath = S_OneDriveCommercialPath & Replace(Mid(S_Url, S_PathPosition), "/", S_pathSeparator)

    ' For personal OneDrive: S_Url = "https://d.docs.live.net/[CID_number]/[file_path]"
    Else
        '********************************************************************************
        '         1         2         3         4         5         6         7         8
        '12345678901234567890123456789012345678901234567890123456789012345678901234567890
        'https://d.docs.live.net/f53c0b88b096e170/desktop/Excel_Ichiran_Ver3.4
        '********************************************************************************

        S_PathPosition = InStr(9, S_Url, "/")                                       '9 = Len("https://") + 1
        S_PathPosition = InStr(S_PathPosition + 1, S_Url, "/")
        OneDrivePath = S_OneDriveConsumerPath & Replace(Mid(S_Url, S_PathPosition), "/", S_pathSeparator)

    End If

End Function

When referencing files on OneDrive, the usual ThisWorkbook.Path and ActiveWorkbook.Path commands:

Debug.Print ThisWorkbook.Path
Debug.Print ActiveWorkbook.Path

return a URL (https://company-my.sharepoint.com/personal/username_domain_com/Documents/filepath), but using the defined function:

Debug.Print OneDrivePath(ThisWorkbook.Path)
Debug.Print OneDrivePath(ActiveWorkbook.Path)

returns the local file path (e.g., C:\Users\username\OneDrive - company), allowing file access within your program.

Supplement: How to get the OneDrive path

These are the environment variables used in the function above (read with Environ):

  • OneDriveCommercial: root path of the OneDrive folder for organization accounts
  • OneDriveConsumer: root path of the OneDrive folder for personal accounts
  • OneDrive: same as OneDriveCommercial

It's a bit confusing, perhaps because you can use both organization and personal accounts.
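
As a quick check outside Excel, the same environment variables can be inspected with a few lines of Python; this is just a convenience sketch of my own, not part of the VBA solution.

import os

# Environment variables set by the OneDrive sync client;
# a variable is missing if no account of that type is signed in on the machine.
for name in ("OneDrive", "OneDriveCommercial", "OneDriveConsumer"):
    print(name, "=", os.environ.get(name, "(not set)"))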

Other reference sites

For creating HTML tables, use tablesgenerator https://www.tablesgenerator.com/.

GitHub

https://github.com/computational-sediment-hyd/onedrivepath

Thursday, 9 June 2022

What is Sediment Hydraulics? part 1 : What is Sediment Hydraulics?

River during flood

  • During a flood, a river is brown and muddy. This is due to sediment being transported with the water.

Fig. A river in normal and flooding conditions [https://www.river.go.jp/index]

  • The amount (concentration) of sediment transport depends on the flow of the river and catchment characteristics.

Fig. Confluence of Arve and Rhone rivers : [https://unsplash.com/]

Fig. Confluence of Tama and Nippara rivers

Sediment transport in a river during a flood

Sediment of different sizes moves in different ways during a flood: coarse sediment rolls along the river bed, slightly finer sediment jumps along the bed, and much finer sediment is suspended in the water.

What is “Sediment Hydraulics”?

  • A discipline that addresses problems caused by sediment transport in a river
  • Although it is a part of hydraulics, it is often treated independently as “Sediment Hydraulics,” “Hydraulics of Sediment Transport,” etc.
  • Many points have not yet been clarified academically or systematized.
  • It is one of the more difficult areas of hydraulics.
  • Classes on this area of hydraulics are rarely taught.

Key words of “Sediment Hydraulics”

“Sediment Transport”

  • Sediment moved by the river flow

“River Bed Variation”

  • The riverbed shape changes due to sediment transport imbalance.

Friday, 11 March 2022

Implementation of the L-BFGS method (Limited-memory BFGS method) using Python

Introduction

We write about the L-BFGS method (Limited-memory BFGS method; BFGS is one of the quasi-Newton methods), which is widely used for the optimization of nonlinear problems. Quasi-Newton methods are a type of gradient-based method. Stochastic gradient descent is the workhorse of deep learning, but the L-BFGS method is still commonly used in general optimization problems and in machine learning beyond deep learning. Even so, there are not many open libraries for the L-BFGS method.

Possible reasons are:

  • Many parameters are available and need to be tuned to fit the application.
  • Computations are prone to instability and require many error traps.

In this post we focus on the main part of the L-BFGS method, the update of the inverse Hessian approximation, rather than on a fully general implementation.

Derivation of the L-BFGS method

A Google search provides plenty of information, but the Wikipedia article is the most complete reference.

Limited-memory BFGS - Wikipedia

There is also an interesting paper on speeding up the L-BFGS method, though somewhat different from what is discussed here.

https://papers.nips.cc/paper/5333-large-scale-l-bfgs-using-mapreduce.pdf
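
Since the formula is useful for reading the code below, here is one common form of the two-loop recursion (as described in the Wikipedia article above), written with the initial vector taken as $-g_k$ so that the result is directly the search direction, matching the code below. With $s_i = x_{i+1}-x_i$ and $y_i = g_{i+1}-g_i$ stored for the last $m$ iterations:

$$ \begin{aligned} & q \leftarrow -g_k \\ & \text{for } i=k-1,\dots,k-m: \quad \alpha_i=\frac{s_i^{\top} q}{y_i^{\top} s_i}, \qquad q \leftarrow q-\alpha_i y_i \\ & q \leftarrow \frac{s_{k-1}^{\top} y_{k-1}}{y_{k-1}^{\top} y_{k-1}}\, q \\ & \text{for } i=k-m,\dots,k-1: \quad \beta=\frac{y_i^{\top} q}{y_i^{\top} s_i}, \qquad q \leftarrow q+\left(\alpha_i-\beta\right) s_i \end{aligned} $$

The resulting $q$ approximates $-\nabla^2 f(x_k)^{-1}\, g_k$ and is used as the search direction $d$, with the update $x_{k+1} = x_k + \eta\, d$ for a step size $\eta$.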

Implementation of the L-BFGS method

We code the above recursion as is. As usual, the programming language is Python.

import numpy as np

def lbfgs(x, f, g, stepsize, maxiterate, memorysize, epsilon):

    def searchdirection(s, y, g):
        # Two-loop recursion: apply the approximate inverse Hessian to -g
        q = -np.copy(g)
        num = len(s)
        a = np.zeros(num)
        
        # No history yet: fall back to the steepest descent direction
        if num==0 : return q
        
        # First loop: from the newest history pair to the oldest
        for i in np.arange(num)[::-1]:
            a[i] = np.dot(s[i], q)/np.dot(y[i], s[i])
            q -= a[i]*y[i]
        
        # Scale by gamma = (s.y)/(y.y) of the latest pair (initial inverse-Hessian guess)
        q = np.dot(s[-1], y[-1]) / np.dot(y[-1], y[-1]) * q
            
        # Second loop: from the oldest history pair to the newest
        for i in range(num) : 
            b = np.dot(y[i], q) / np.dot(y[i], s[i])
            q += ( a[i] - b ) * s[i]
            
        return q
    
    outx = []
    s = np.empty((0, len(x)))
    y = np.empty((0, len(x)))
    xold = x.copy() 
    gold = g(xold) 
    J1 = f(xold)
    mmax = memorysize
    sp = stepsize
    print( 'f= ', J1 )
    
    outx.append(xold)
    for num in range(maxiterate):
        if np.linalg.norm(gold) < epsilon : 
            print('g=',np.linalg.norm(gold))
            break
            
        d = searchdirection(s, y, gold)
        
        # Use a much smaller step on the first iteration (no curvature history yet)
        sp = stepsize*0.01 if num == 0 else stepsize
        
        xnew = xold + sp * d
        gnew = g(xnew) 
        J2 = f(xnew)
        
        J1 = J2
        # Update the curvature history; once memorysize pairs are stored,
        # drop the oldest pair by rolling the arrays and overwriting the last slot
        si, yi = xnew - xold, gnew - gold
        if len(s) == mmax :
            s = np.roll(s, -1, axis=0)
            y = np.roll(y, -1, axis=0)
            s[-1] = si
            y[-1] = yi
        else:
            s = np.append(s, [si], axis=0)
            y = np.append(y, [yi], axis=0)
        
        xold, gold = xnew, gnew
        
        print( 'iterate= ', num, 'f= ', J1, ' stepsize= ', sp )
        outx.append(xold)
        
    return xold, outx

Its arguments are the initial variable x, the objective function f and its gradient g, followed by the step size (stepsize), the maximum number of iterations (maxiterate), the number of past steps kept for the Hessian update (memorysize), and the convergence tolerance (epsilon).

There is no particular coding difficulty, but it is interesting that once the history buffer reaches memorysize, the oldest entry is dropped by shifting the arrays with numpy's roll function.
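
A toy illustration of this rolling buffer (my own snippet, not part of the solver): roll shifts the oldest pair to the end of the array, where it is then overwritten by the newest one.

import numpy as np

s = np.array([[1., 1.], [2., 2.], [3., 3.]])   # history buffer, oldest pair first
s = np.roll(s, -1, axis=0)                     # -> [[2, 2], [3, 3], [1, 1]]
s[-1] = [4., 4.]                               # overwrite the freed slot with the newest pair
print(s)                                       # [[2, 2], [3, 3], [4, 4]]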

We use a much smaller step size in the first iteration. In this computational example a full first step would also be fine, but when the evaluation function involves solving a differential equation, a large first step can cause divergence.

Calculation example

Use the function of the L-BFGS method coded in the following equation.

$$ \begin{aligned} X & =\binom{x}{y} \\ f & =5 x^2-6 x y+5 y^2-10 x+6 y \\ g & =\frac{\partial f}{\partial X}=\binom{\frac{\partial f}{\partial x}}{\frac{\partial f}{\partial y}}=\binom{10 x-6 y-10}{10 y-6 x+6} \end{aligned} $$

The conditions of calculation are as follows.

  • The initial value is $x=-2.5,y=0$.
  • The optimal value is $x=1,y=0$.
  • The step size is 1.

%%time
f = lambda x : 5.0*x[0]**2.0 - 6.0*x[0]*x[1] + 5.0*x[1]**2 - 10.0*x[0] + 6.0*x[1]
g = lambda x : np.array( [ 10.0 * x[0] - 6.0 *x[1] -10.0, 10.0 * x[1] - 6.0 *x[0] + 6.0 ] )

# initial value
x = np.array( [-2.5, 0.0] )

xx, out = lbfgs(x, f, g, stepsize=1.0, maxiterate=20, memorysize=5, epsilon=10.0**(-8))
print(xx)

> f=  56.25
> iterate=  0 f=  40.864  stepsize=  0.01
> iterate=  1 f=  -1.3307052922247742  stepsize=  1.0
> iterate=  2 f=  -3.1273465839434973  stepsize=  1.0
> iterate=  3 f=  -4.999999466916309  stepsize=  1.0
> iterate=  4 f=  -4.999999999941274  stepsize=  1.0
> iterate=  5 f=  -4.999999999999999  stepsize=  1.0
> iterate=  6 f=  -5.0  stepsize=  1.0
> g= 8.580967415593408e-10
> [1.00000000e+00 1.53469307e-10]
> Wall time: 4 ms

The computation converges after 6 iterations. The process is illustrated in the following figure.
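
As a sketch of how such a figure can be drawn from the recorded iterates (my own code, assuming matplotlib and the out list returned by lbfgs above):

import numpy as np
import matplotlib.pyplot as plt

path = np.array(out)                                   # iterates recorded by lbfgs
X, Y = np.meshgrid(np.linspace(-3.0, 2.0, 200), np.linspace(-2.0, 1.5, 200))
Z = 5.0*X**2 - 6.0*X*Y + 5.0*Y**2 - 10.0*X + 6.0*Y     # same objective as above

plt.contour(X, Y, Z, levels=30)
plt.plot(path[:, 0], path[:, 1], 'o-', label='L-BFGS path')
plt.plot(1.0, 0.0, 'r*', markersize=12, label='optimum')
plt.xlabel('x')
plt.ylabel('y')
plt.legend()
plt.show()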

Comparison with scipy.optimize.fmin_l_bfgs_b

SciPy provides the L-BFGS method as scipy.optimize.fmin_l_bfgs_b. We compare our implementation with it for confirmation.

%%time
import scipy.optimize as opt

f = lambda x : 5.0*x[0]**2.0 - 6.0*x[0]*x[1] + 5.0*x[1]**2 - 10.0*x[0] + 6.0*x[1]
g = lambda x : np.array( [ 10.0 * x[0] - 6.0 *x[1] -10.0, 10.0 * x[1] - 6.0 *x[0] + 6.0 ] )

out2 = []
def callback(xk):
    out2.append(xk)
    print(xk)
    
# initial value
x = np.array( [-2.5, 0.0] )
out2.append(x)
xx, y, d = opt.fmin_l_bfgs_b(f, x, fprime=g, m=5, iprint=0, epsilon=10.0**(-8), maxiter=20, maxls=1, callback=callback)

> [-1.64250707 -0.51449576]
> [ 0.10902424 -1.00977252]
> [ 0.4446687  -0.73580975]
> [ 1.00115755 -0.00273162]
> [ 1.00005759e+00 -1.68279906e-05]
> [1.00000095e+00 7.64484423e-07]
> Wall time: 3 ms

As a result, our hand-written code is almost the same in terms of computation time, number of iterations, and convergence process.

Memo

scipy.optimize.fmin_l_bfgs_b is built on thousands of lines of Fortran 90 code from the following library.

L-BFGS-B Nonlinear Optimization Code

We once used this to write Fortran 2003 code, but it was a lot of work.
It is highly functional and handles a wide variety of cases, but for simple problems a few dozen lines of code suffice.

Comparison with the gradient descent method

We also show a comparison with a simple gradient descent method. There are two cases, with step sizes of 0.05 and 0.1, respectively. Both cases were run for 20 iterations.

%%time
f = lambda x : 5.0*x[0]**2.0 - 6.0*x[0]*x[1] + 5.0*x[1]**2 - 10.0*x[0] + 6.0*x[1]
g = lambda x : np.array( [ 10.0 * x[0] - 6.0 *x[1] -10.0, 10.0 * x[1] - 6.0 *x[0] + 6.0 ] )

# initial value
x = np.array( [-2.5, 0.0] )

out3 = []
out3.append(x)
for _ in range(20):
    xx = x - 0.05*g(x)
    out3.append(xx)
    x = xx.copy()
    
# initial value
x = np.array( [-2.5, 0.0] )

out4 = []
out4.append(x)
for _ in range(20):
    xx = x - 0.1*g(x)
    out4.append(xx)
    x = xx.copy()

In both cases the iterates converge toward the optimum, but the paths they take are very different. The step size is important for the gradient descent method.

Discussion and Summary

  • We have written about a simple implementation of the L-BFGS method. A version that works on real problems needs more care, but the basics are the same.
  • One advantage of the L-BFGS method is that the step size, which is so important in the gradient descent method, is effectively set by the method itself. If you look at generic optimization code, the majority of it is devoted to choosing the step size (the line search), so being able to largely omit this is a real advantage.
  • Also, a line search based on the Armijo and Wolfe conditions evaluates f and g several times per iteration (a minimal sketch is given after this list). That is not a problem in simple cases, but in fluid optimization problems a single evaluation of f and g often takes tens of minutes to several hours, so the L-BFGS method has a clear advantage in computation time.
  • If necessary we will write an example of an application of the L-BFGS method to a fluid problem (Lagrange’s undetermined multiplier method).
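
For reference only (my own sketch, not part of the implementation above), here is a minimal backtracking line search based on the Armijo condition; it shows why each outer iteration may call f several times.

import numpy as np

def armijo_backtracking(f, g, x, d, alpha0=1.0, rho=0.5, c=1.0e-4, maxtry=20):
    # Shrink the step until f decreases sufficiently along the descent direction d
    fx = f(x)
    slope = np.dot(g(x), d)                        # directional derivative (negative for a descent direction)
    alpha = alpha0
    for _ in range(maxtry):
        if f(x + alpha*d) <= fx + c*alpha*slope:   # Armijo condition; each test costs one evaluation of f
            return alpha
        alpha *= rho                               # reject, shrink the step, try again
    return alpha

For a well-scaled problem like the quadratic example above, the full step alpha = 1 tends to be accepted right away, which is in line with the point about L-BFGS made above.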

Github

github.com
