
Conversation

@TallJimbo (Member)

This is a small cleanup (reducing reliance on a middleware interface we'd prefer to stop treating as public) that should only be merged after the 'camera' datasets in all LATISS repos have been rebuilt with post-DM-20746 code.

@TallJimbo changed the title from "Defer to camera geometry instead of raw formatter for intiial WCS." to "DM-20746: Defer to camera geometry instead of raw formatter for intiial WCS." on Jun 20, 2025

@cmsaunders left a comment


Looks good.

@cmsaunders

Rereading the comments on the Jira ticket, I see that this is not supposed to be merged with the other PRs.

@kfindeisen (Member) commented Oct 17, 2025

Is this still relevant?

(Also, typo in the commit message)

@TallJimbo (Member, Author)

Yes, this is still relevant, but I haven't been paying attention to whether the precondition has been met (though if prompt processing no longer cares about LATISS, that's another way to satisfy it).

I've rebased and fixed the commit message.

@kfindeisen (Member) commented Oct 17, 2025

We are still using LATISS data for integration testing, but I have no idea how to recognize a "rebuilt camera".

Will it fail spectacularly if we try to run it on an incompatible repo?

@TallJimbo (Member, Author)

Yes, I expect you'll see astrometry-solving failures a large fraction of the time if the camera hasn't been rebuilt.

If the camera has been rebuilt,

butler.get("camera").getFocalPlaneParity()

should return True instead of False. The reason this is only relevant for LATISS (and DECam) is that our other cameras will always return False.
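For anyone wanting to run that check against a specific repo, a minimal sketch is below. The repo path and collection name are assumptions (your repo may keep the camera in a different calibration collection, e.g. a per-date certified collection rather than the unbounded one), so adjust them for the repo you are actually testing.

# Sketch of the focal-plane-parity check described above.
# Assumptions: repo at /repo/main, camera available in LATISS/calib/unbounded.
from lsst.daf.butler import Butler

butler = Butler("/repo/main", collections=["LATISS/calib/unbounded"])
camera = butler.get("camera", instrument="LATISS")

# True  -> camera rebuilt with post-DM-20746 code; safe to merge/run this PR.
# False -> old camera geometry; expect frequent astrometry-solving failures.
print(camera.getFocalPlaneParity())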

@kfindeisen (Member) commented Oct 17, 2025

Basic tests (upload.py) work fine. A larger run of 38 LATISS visits produced 3 instances of NoPsfStarsToStarsMatchError, 2 of ObjectSizeNoGoodSourcesError, and 1 of BadSubtractionError.

A run of 38 (independently selected) LATISS visits on latest produced 5 instances of NoPsfStarsToStarsMatchError, 2 of ObjectSizeNoGoodSourcesError, 2 of BadSubtractionError, and 1 of PsfexTooFewGoodStarsError. So I think we're good?

@TallJimbo (Member, Author)

That seems like a fairly high failure rate, but I don't have a good baseline for LATISS. Is this /repo/main? I can just go inspect whatever repo is relevant manually, and then we can be sure.

@TallJimbo (Member, Author)

It looks like the camera in /repo/main has not been rebuilt since this change. I'll go see if I can figure out who is maintaining calibrations for LATISS and ask if we can just do that.

@kfindeisen (Member)

Re: the failure rate, the imaging pipeline has never been optimized for LATISS. That's why I did the latest run (i.e., without this PR) for comparison.

We have our own dev repo, but I don't know when or from where the camera was last sourced. @hsinfang might know.

