Has anyone else thought the UK is basically training its healthcare professionals to work somewhere else?

So, my husband (35M) and I (33F) immigrated to the UK. We're now British citizens and we work in the NHS. Working in the UK has its ups and downs, but at the moment it's mostly just frustrating. We're planning to find work overseas soon (highly likely in the Middle East = no tax, triple the salary, more allowances) just to be able to fund a mortgage and actually live. It's ridiculous how the UK seems to treat healthcare as non-essential, and our salaries don't match our job descriptions. Anyone else thinking of leaving?