Yes, Brad. I always have the phone on me, but recently I've taken to leaving my Sony A6000 mirrorless camera at home when I go out. I'm pretty sure I'll get around to using it again, maybe a lot, but right now I'm having too much fun with the phone. I used to go out with the Sony with my 50mm f/1.8 lens practically glued to the front. The iPhone XS has two lenses, 26mm and 52mm, so I can get a wider range of street photos, and there's no noticeable difference in quality between the A6000/50mm and the XS unless you pixel-peep at 100%.
Locking focus in the way you describe is a feature of all of the camera apps I use on the iPhone, but I tend not to decide which format (in the sense of aspect ratio) to use until I get home and see the photos on a bigger screen, unless I'm shooting something where I want to process the photos and upload them while on location. And you're right about people's treatment of phone camera users: some people won't bother stopping if you're right in front of them taking a photo, but because those same people think your photography isn't important, you can more easily get street photos, and especially street candids. That's always been the way with phones, though. While you're busy getting and locking focus and exposure, look quizzically up at street signs and people think you're using the Maps app.
What you're calling "processor-created out-of-focus backgrounds" is part of what's known as computational photography. My iPhone has it. Both of the lenses have wide apertures, so you can get the background out of focus if your subject is close to you and a long way from the background. The only portrait-style photo I can show you as an example is this one I took of Brexit Man (Steve Bray) outside the Houses of Parliament:
View: https://www.flickr.com/photos/garryknight/48794896706/in/dateposted/
But you can use the same function to get good non-portrait shots, too. Like this one of some leaves in a park:
View: https://www.flickr.com/photos/garryknight/49089965303/in/dateposted/
The function is really aimed at portraits, taken with the phone upright, and it looks out for a face to use as the subject. But you can fool it into accepting other subjects and blurring out the background in an acceptable way. And what's even better is that, since the blur is created by the software, you can change it after you've taken the photo. So, if the blur isn't looking good enough, I can (on the phone, or after I've copied it to my iPad) go into Edit mode and change the slider from f/1.4 all the way up to f/16 and use whatever works best.
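If you're curious how that after-the-fact blur can work in principle, here's a toy Python sketch. To be clear, this is my own guess at the general idea, not Apple's actual pipeline, and the function name and the f-number-to-blur mapping are made up for illustration: the phone keeps the sharp frame plus a depth map, blurs a copy of the frame, and blends the two based on each pixel's distance from the focal plane. Since the sharp original is kept, the "aperture" slider can be re-run at any strength later.

```python
import numpy as np

def synthetic_bokeh(image, depth, focus_depth, f_number):
    """Toy simulation of software background blur (hypothetical, not Apple's).

    image:       2D grayscale array
    depth:       per-pixel depth map, same shape as image
    focus_depth: depth value that should stay sharp
    f_number:    simulated aperture; smaller -> stronger blur, like the slider
    """
    # Invented mapping: f/1.4 gives a big blur radius, f/16 almost none.
    max_radius = int(round(16 / f_number))
    if max_radius <= 0:
        return image.copy()

    # Cheap separable box blur as a stand-in for a real lens-blur kernel.
    blurred = image.astype(float)
    kernel = np.ones(2 * max_radius + 1) / (2 * max_radius + 1)
    for axis in (0, 1):
        blurred = np.apply_along_axis(
            lambda row: np.convolve(row, kernel, mode="same"), axis, blurred
        )

    # Pixels near the focal plane keep the sharp original; far ones go blurry.
    weight = np.clip(np.abs(depth - focus_depth), 0.0, 1.0)
    return (1.0 - weight) * image + weight * blurred
```

Because the blend is recomputed from the stored sharp image each time, "editing the aperture" later is just calling this again with a different `f_number`, which matches what the Edit-mode slider appears to do.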