Which won't work. Devs will make games for the base machine, since that's the one everyone has. There's a reason consoles aren't upgradeable: not everyone, by any stretch, buys the upgrade.
It is like you read the word PC and **** your pants
On the consumer end it would be no different from what we have now. You do realize you are basically running a PC with a slightly different architecture and unified RAM right now, don't you? Barring those differences, it is exactly what I am proposing. The massive benefit is that MS would not have to develop for two different machines: no more ESRAM juggling, to name the main problem (though there are many hurdles). All they would need to do is tweak the settings for the best possible experience on that set box, and it would be exactly the same user-friendly operation as the consoles we have now.
Same CPU, memory, motherboard, etc. The only potential change would be the graphics card. Exactly what we have now, only in the future, people who felt inclined could upgrade the card to run a game at a higher resolution, with a better AA solution, higher-quality textures, and so on. I'm not talking hundreds or thousands of combinations here, not even dozens. I'm talking a couple; in fact, quite literally two. Easily maintainable, just as plug and play as anything before it, and optimized for each combination, both for future releases and, optionally, for past ones if the developer wants. Even if the devs never update an older game, it's irrelevant: it would work exactly the same as on the previous card, with drivers handled the same as always.
There would be no change to past games in terms of compatibility, and every future release would obviously target the lowest common denominator, the original box, while also being able to include the latest techniques, bells and whistles accessible to the new hardware. Basically, what we have now on PC, but without the hassle of developing for lots of different combinations on the developer's side, and no hassle on the consumer's end either, because the game will have already been optimized and set in stone for a plug and play, out-of-the-box experience.
It would be no different from what we have now on console in that regard. None.
For example.
Say, (highly) hypothetically, MS released this box in 2018 and put out lots of games on it looking gorgeous at 1080p 60fps. Two or three years down the line they release a new card which plays these games exactly the same, but at 4K 60fps with better textures, AA and so on, just as 4K TVs become the most common sets in homes. Now say Sony released their box around the same time with very similar specs but no way to upgrade. A console gamer would have two options: wait another 2-3 years for Sony and MS to release completely new hardware costing, say, £350-£500 so you can finally use your 2-3-year-old TV the way it was meant to be used, or pay half that right there and then for glorious 4K 60fps in all your games, both present and past. Past in the sense that they had a W10 PC release and already supported such features, which is already the case now.
One last thing I would add about an expansion port versus a new console: unless there is a major breakthrough, it is highly unlikely CPUs are going to make any kind of sizable leap anytime soon, barring some exotic new type of chipset with little to no infrastructure, costing thousands of pounds. You never know, of course; I just seriously doubt it, and even then console manufacturers would be stupid to make the leap (PS2 and PS3 devs would vouch for that). Couple that with the fact that the vast majority of games do not utilize CPU cores at all well on PC, because of the vast differences between each PC. It would take 2-3 years to maximise the potential of a competitive PC CPU in the set environment a console offers. By that time graphics cards, just as they are now, will be making inroads, as their ceiling has not yet been hit, so to speak, the way it has with CPUs.
Once again: PC architecture in a plug-and-play console environment, not the other way round, as you misinterpreted.