
Grid imprinting when running C192 atmos with 1/4 deg ocean and seaice #2466

guillaumevernieres opened this issue Oct 11, 2024 · 17 comments

guillaumevernieres commented Oct 11, 2024

Description

We can see the imprinting of a grid on the sea ice and ocean. The attached figure shows the sea ice concentration standard deviation across a 30-member ensemble after a 6-hour forecast.

The initial conditions were created by:
1 - interpolating the operational GFS ensemble members from C384 to C192
2 - using the same ocean and sea ice initial conditions for all 30 members
3 - omitting the mediator restarts

[Figure: oops-seice-stddev — ensemble standard deviation of sea ice concentration]
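For reference, a minimal sketch of this diagnostic, assuming one history file per member; the file layout and the variable name ("aice") are illustrative placeholders, not the actual files from this experiment:

```python
# Sketch: ensemble standard deviation of sea ice concentration 6 h into the forecast.
# "memNNN/ice_hist_6h.nc" and the variable name "aice" are hypothetical placeholders.
import xarray as xr

members = [
    xr.open_dataset(f"mem{m:03d}/ice_hist_6h.nc")["aice"]  # one file per ensemble member
    for m in range(1, 31)
]

# Stack along a new "member" dimension and take the ensemble standard deviation;
# the grid imprinting shows up as a regular pattern in this field.
ens = xr.concat(members, dim="member")
ens.std(dim="member").to_netcdf("aice_ens_stddev.nc")
```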

To Reproduce:

I don't think this is machine dependent, but my test was done on Hercules.

  • Use a configuration of the coupled UFS that has FV3, MOM6, and CICE6
  • Use C192 for FV3 and 1/4 deg for MOM6 and CICE
  • Bootstrap (i.e., omit) the mediator restart
guillaumevernieres added the bug label on Oct 11, 2024
guillaumevernieres (Author) commented:

pinging @DeniseWorthen and @junwang-noaa

junwang-noaa (Collaborator) commented:

@DeniseWorthen Is this issue related to how ocean/ice grids/meshes are created or the interpolation issue between the two components?

DeniseWorthen (Collaborator) commented:

I believe the issue is related to the conservative mapping we do for ATM states -> OCN and ICE. At least one reason we haven't switched to bilinear mapping for those states is the very old "stripes" issue that Ufuk mentioned recently: NOAA-EMC/CMEPS#38 (comment)

benjamin-cash commented:

@guillaumevernieres - could you point me to the ICs on Hercules and the hash of the model/workflow you are using? I have the model set up to run C192mx025 on Frontera and can see if I get the same result. I'm also having an issue with the model crashing on startup that I suspect is related to the ICs I'm using, so this would also let me test that.

guillaumevernieres (Author) commented:

> could you point me to the ICs on Hercules and hash of the model/workflow you are using? […]

Warm restarts for m001:

/work/noaa/da/gvernier/sandboxes/global-workflow/sorc/gdas.cd/build/gdas/test/gw-ci/GFSv17-3DVAR-C384mx025/COMROOT/GFSv17-3DVAR-C384mx025/enkfgdas.20210630/12/mem001/model

UFS commit (ufs_model.fd): 6a4e09e94773ffa39ce7ab6a54a885efada91f21

benjamin-cash commented:

@guillaumevernieres - It looks like I will need permissions added to see those files, starting with the sandboxes directory:

[cash@hercules-login-3 ~]$ ls -l /work/noaa/da/gvernier
total 679420
-rwxr-x---  1 gvernier da 690850711 Jun 13  2022 Anaconda3-2022.05-Linux-x86_64.sh
drwxr-sr-x 11 gvernier da      4096 Apr 17 15:10 ai
drwxr-s--- 28 gvernier da      4096 Jun 13  2022 anaconda3
drwxr-sr-x  8 gvernier da      4096 Sep 30 17:07 ensda
-rw-r-----  1 gvernier da     11021 Jun 13  2022 oceanview.yaml
drwxr-s---  6 gvernier da      4096 Oct 10 18:23 sandboxes

guillaumevernieres (Author) commented:

> It looks like I will need permissions added to see those files, starting with the sandboxes directory: […]

Sure, let's take this offline, @benjamin-cash. My email is [email protected]


guillaumevernieres commented Oct 11, 2024 via email


sanAkel commented Oct 12, 2024

> Let's try this: /work/noaa/da/gvernier/mem001. Not sure this will work since da and gvernier are definitely da. --- Guillaume Vernieres

It works for me!

CatherineThomas-NOAA commented:

@junwang-noaa @LarissaReames-NOAA: There was some discussion in our GFS workflow meeting last week about a beta snapshot of ESMF that may address this issue. Is there a test version installed somewhere that the marine DA could use for testing?


DeniseWorthen commented Nov 14, 2024

You may be referring to a snapshot that Ufuk has tested which resolves his "stripes" issue. That issue is only tangentially related to the grid-imprint issue: we have not been using bilinear mapping of states from ATM->ICE (which ICE then uses to calculate its own fluxes) because of the underlying stripes issue. But Guillaume reported that a test branch using bilinear mapping did not resolve the imprinting.

I've been using Ufuk's build on Hercules with a modification (below) to the modulefiles, but I still see an "imprint" using a C48-1deg OCN/ICE.

But, heads up, I'm going to need to put further investigation and debugging on hold while I tackle issue #2486.

diff --git a/modulefiles/ufs_hercules.intel.lua b/modulefiles/ufs_hercules.intel.lua
index 455ea4d0..05823161 100644
--- a/modulefiles/ufs_hercules.intel.lua
+++ b/modulefiles/ufs_hercules.intel.lua
@@ -18,6 +18,10 @@ load("ufs_common")
 nccmp_ver=os.getenv("nccmp_ver") or "1.9.0.1"
 load(pathJoin("nccmp", nccmp_ver))

+unload("esmf/8.6.0")
+prepend_path("MODULEPATH", "/work2/noaa/nems/tufuk/LAND/esmf/modulefiles")
+load(pathJoin("esmf", "8.6.0"))
+
 setenv("CC", "mpiicc")
 setenv("CXX", "mpiicpc")
 setenv("FC", "mpiifort")diff --git a/modulefiles/ufs_hercules.intel.lua b/modulefiles/ufs_hercules.intel.lua


DeniseWorthen commented Nov 22, 2024

A brief update: I've set up a C96mx025 case in order to exacerbate the contribution from mismatched grid resolutions. I've been testing the impact of using 2nd-order conservative mapping from ATM->OCN. Bob (ESMF) had told me via email:

> Have you tried 2nd order conservative (i.e. ESMF_REGRIDMETHOD_CONSERVE_2ND)? It'll also give you a smoother destination field. The issue with it though is, like patch regridding, it's not monotonic. I.e., the output values are not guaranteed to be within the range of the input. In any case, it might be interesting to give it a try and see if it improves things.
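As a toy illustration only, not the ESMF/CMEPS code path: the sketch below remaps a smooth field from a coarse 1-D grid onto a nested fine grid. A first-order conservative map leaves the destination piecewise constant over each source cell, analogous to the source-grid "imprint" discussed here, while a second-order map adds a within-cell gradient and is smoother, though not monotonic.

```python
# Toy 1-D sketch (numpy only), not the ESMF/CMEPS implementation.
import numpy as np

n_coarse, refine = 16, 8                         # 16 coarse cells, each split into 8 fine cells
dx = 1.0 / n_coarse
xc = (np.arange(n_coarse) + 0.5) * dx            # coarse cell centers
xf = (np.arange(n_coarse * refine) + 0.5) * dx / refine   # fine cell centers

field_c = np.sin(2.0 * np.pi * xc)               # coarse-cell mean values of a smooth field

# 1st-order conservative: every fine cell inherits its parent coarse cell's mean,
# so the destination field is piecewise constant (the source grid "imprint").
first_order = np.repeat(field_c, refine)

# 2nd-order conservative: add a linear reconstruction using a centered-difference
# slope in each coarse cell. Coarse-cell means are preserved, but the result is
# not guaranteed to stay within the range of the input (not monotonic).
slope = np.gradient(field_c, dx)
second_order = first_order + np.repeat(slope, refine) * (xf - np.repeat(xc, refine))

# Both maps reproduce the coarse-cell means exactly (conservative for nested cells).
assert np.allclose(second_order.reshape(n_coarse, refine).mean(axis=1), field_c)

print("largest jump between adjacent fine cells, 1st order:", np.abs(np.diff(first_order)).max())
print("largest jump between adjacent fine cells, 2nd order:", np.abs(np.diff(second_order)).max())
```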

I've implemented 2nd-order mapping and applied it to a single ATM->OCN field, the LWNET. I set up a test case where every component runs at the same coupling frequency, in order to eliminate any time averaging/smoothing, and I've been looking at the mediator history files at every timestep (i.e., each pass through the coupling loop). The following figure shows the base case on the left, where LWNET is mapped with first-order conservative, and on the right using 2nd-order conservative. I'm plotting the variance of the field over the first ~2 days on the same color scale.

[Figure: variance of LWNET over the first ~2 days; first-order conservative (left) vs 2nd-order conservative (right)]

The 2nd-order field is smoother in many places, although not everywhere; see for example the Barents Sea, where the impact appears small, versus the South Pacific, where the result looks much smoother to my eye.
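For reference, the variance diagnostic above can be scripted along these lines; the file pattern, the variable name for the mapped LWNET field, and the number of timesteps are placeholders for whatever this particular run wrote out:

```python
# Sketch: per-gridpoint variance of a mapped field over the first ~2 days of
# per-timestep mediator history output. Names below are illustrative placeholders.
import xarray as xr

hist = xr.open_mfdataset("ufs.cpl.hi.*.nc", combine="nested", concat_dim="time")

lwnet = hist["lwnet_on_ocn"]                     # placeholder name for LWNET on the OCN grid
nsteps = 192                                     # stands in for "~2 days" of coupling steps

# Grid imprinting shows up as a regular pattern in this variance map.
lwnet.isel(time=slice(0, nsteps)).var(dim="time").to_netcdf("lwnet_var.nc")
```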

My feature branch right now is set up w/ configurable variables (so I can turn things on/off w/o recompiling). I could create a hard-wired branch for @guillaumevernieres if you want to test the impact in your case.


DeniseWorthen commented Dec 2, 2024

I've been able to reproduce the original "feature" in a C96mx025 case for the ice fields using a single run (i.e., not an ensemble). I'm using the basic setup from my last post, but in this case with a set of kludged-together ICs for mid-July.

The following figure shows the ice-fraction tendency for the original case (left) and using patch mapping (right) for the ATM->ICE states. I'm now trying bilinear. All my tests are on Hercules, using Ufuk's module files containing an ESMF fix which resolves the "stripes" mapping issue.

[Figure: ice-fraction tendency; original conservative mapping (left) vs patch mapping (right) for ATM->ICE states]
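The tendency and the run-to-run comparison can be scripted in the same spirit as the variance diagnostic above; the directory names and the ice-fraction variable name are placeholders:

```python
# Sketch: ice-fraction tendency from per-timestep history output, and the
# difference between two runs that differ only in the ATM->ICE mapping.
# Paths and the variable name are illustrative placeholders.
import xarray as xr

def ice_frac_tendency(pattern):
    ds = xr.open_mfdataset(pattern, combine="nested", concat_dim="time")
    return ds["ice_fraction"].diff(dim="time")   # change between successive steps

base  = ice_frac_tendency("conserve_run/ufs.cpl.hi.*.nc")
patch = ice_frac_tendency("patch_run/ufs.cpl.hi.*.nc")

# Assumes both runs wrote output at identical times, so the fields align.
(base - patch).to_netcdf("ice_frac_tendency_diff.nc")
```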

uturuncoglu (Collaborator) commented:

@DeniseWorthen Just to clarify, did you test patch interpolation with the original ESMF version that is shipped with the UFS WM? That will probably show similar results, and using patch would also remove the imprinting issue with that version. I just want to be sure that the improvement here is not coming from the fix that Bob put in for the land component.

DeniseWorthen (Collaborator) commented:

@uturuncoglu Both cases I show above are using the modules from your build on Hercules, so I'm pretty confident that is not what is giving the difference with patch.

DeniseWorthen (Collaborator) commented:

@uturuncoglu I can try the current ESMF build once I get my bilinear case through the queue. It's been sitting for several hours now.


DeniseWorthen commented Dec 3, 2024

The test with bilinear mapping of states from ATM->ICE produced results nearly indistinguishable from the patch case. Only if I subtract the two cases can I see that the tendencies are in fact different. This is the difference of the two mappings, on the same scale:

[Figure: difference in ice-fraction tendency between bilinear and patch mappings of ATM->ICE states, same color scale]
