No worries!
Getting an entire grid for lat and lon can be a little confusing, because most of us are used to grids that run along latitude and longitude values. That isn't always the case, though: if your data uses a different map projection, you might have, say, a y index where not only the latitude varies as you travel along it, but the longitude does too.
If you have a look at these values, you might find that all of the longitude values are the same in one dimension of the grid, and all of the latitude values are the same in the other dimension. If you're using ncdf4, you unfortunately have to subset grids using the grid indices, not actual lat/lon values. But if you're using tidync, that package will actually let you make queries directly on the lat/lon of a NetCDF, like:
tidync('myfile.nc') %>%
hyper_filter(lat = lat < -30, lon = lon > 20)
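For comparison, here's a rough sketch of the index-based subsetting you'd do with ncdf4 instead. This assumes your coordinate variables are 1D vectors named "lat" and "lon" and your data variable is called "land" — all of those names (and "myfile.nc") are guesses, so swap in whatever your file actually uses. If lat/lon turn out to be 2D grids (the curvilinear case I described above), the index lookup gets more involved than this.

```r
library(ncdf4)

nc <- nc_open("myfile.nc")      # open a connection to the file
lat <- ncvar_get(nc, "lat")     # assumed 1D coordinate vectors
lon <- ncvar_get(nc, "lon")

# Translate the lat/lon range you want into grid indices,
# then read just that block with start/count:
lat_idx <- which(lat < -30)
lon_idx <- which(lon > 20)
land <- ncvar_get(nc, "land",
                  start = c(min(lon_idx), min(lat_idx)),
                  count = c(length(lon_idx), length(lat_idx)))
nc_close(nc)
```

The start/count pair has to be in the same dimension order as the variable itself (often lon, lat, then time), so check the dimension order in your file first.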
Rather than using str() on the NetCDF object, are you able to extract the land variable with ncvar_get() and use str() on that? NetCDF files are handled kind of like databases: you open a connection first, then extract parts of the actual contents.
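Something like this — again, the variable name "land" and the filename are my guesses from your post:

```r
library(ncdf4)

nc <- nc_open("myfile.nc")      # this is just the connection, not the data
land <- ncvar_get(nc, "land")   # this actually reads the variable into an array
str(land)                       # now str() shows the array's dimensions and values
nc_close(nc)
```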
I'm gonna be honest with you: it's likely you're going to have to go back to the maintainer of this dataset, or the maker of the software that produced this file, and ask them how the file is structured. One of the advantages of NetCDFs is that they can store metadata, like the units of each variable or dimension (as you can see with some of the other variables there in the str() output) or notes about how the data should be interpreted, and a lot of that is missing here. This is very far from what R folks would call 'tidy' data, and if you can just talk to the data source, it'll probably save us all a lot of time.
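If you want to double-check what metadata is actually in there before emailing them, ncdf4's ncatt_get() will list the attributes (again assuming a variable named "land"):

```r
library(ncdf4)

nc <- nc_open("myfile.nc")
ncatt_get(nc, "land")   # attributes on the variable: units, long_name, etc.
ncatt_get(nc, 0)        # varid 0 = global attributes for the whole file
nc_close(nc)
```

If both of those come back nearly empty, that pretty much confirms you'll need the maintainer's documentation.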