Hi. This is a copy-paste (with minor adaptations) of a post in /r/dota that received some positive feedback. Apologies if this is in the wrong subforum.
The idea is that you would implement a system that allows the player to place a set of enemy heroes (bot controlled, obviously) somewhere on the map and assign each a set of broad instructions (run away from the player, engage the player, engage a player ally, etc.). These instruction sets could be ordered or weighted by priority as defined by the player, and could either act as modifiers to, or take complete precedence over, the "standard" bot priority weights. Obviously you could also assign skill builds, mana, health, levels and items for those heroes (and your own hero). I think it might be worthwhile to allow the user to define certain "win conditions" as well.
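To make the idea concrete, here's a minimal sketch of what the scenario data model might look like. Everything here (the names `Instruction`, `BotConfig`, the `override` flag) is made up for illustration; it just captures the "weighted instructions that can override standard bot priorities" idea from above.

```python
from dataclasses import dataclass

@dataclass
class Instruction:
    action: str             # e.g. "engage_player", "retreat" (illustrative names)
    weight: float           # priority weight assigned by the scenario author
    override: bool = False  # if True, takes complete precedence over standard bot logic

@dataclass
class BotConfig:
    hero: str
    level: int
    items: list
    instructions: list      # list of Instruction objects

    def top_instruction(self):
        # Overriding instructions win outright; otherwise the highest weight wins.
        overrides = [i for i in self.instructions if i.override]
        pool = overrides if overrides else self.instructions
        return max(pool, key=lambda i: i.weight)
```

A scenario file would then just be a collection of these bot configs plus a list of win conditions.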
To elaborate (and illustrate via example), imagine you want to practise your Puck play. You could set up three bots behind the tree line in the bottom lane and instruct them all to engage. It might be necessary to introduce a stochastic element to the bot behaviour to ensure it isn't too predictable. I'm guessing the bots already make decisions by sampling from some probability distribution, so random elements in bot behaviour are probably there already. Anyway, you could practise your phasing, blinking, illusory-orbing, Eulsing, etc. all day long and repeat the scenario until you were satisfied. You could generate some incentive (a dopamine reward!) by assigning a win condition (e.g. reach this area, kill enemy hero z, make sure ally hero x survives, don't die) or some combination of objectives.
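The stochastic element could be as simple as weighted random sampling over the instruction set, so the bots stay biased toward the scenario's priorities without being fully predictable. A sketch (the action names are placeholders, and a real bot would resample on some tick or trigger):

```python
import random

def choose_action(weighted_actions, rng=random):
    """Sample one action in proportion to its weight.

    weighted_actions: list of (action_name, weight) pairs.
    rng: any object with a .choices() method (e.g. random.Random),
         injectable so scenarios can be replayed deterministically.
    """
    actions, weights = zip(*weighted_actions)
    return rng.choices(actions, weights=weights, k=1)[0]
```

With weights like `[("engage_player", 3.0), ("flank", 1.0)]` the bot would engage roughly three times out of four, which keeps the drill repeatable but not rote.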
I really like the idea of such a system. It might also lend itself well to community design of scenarios, which could be shared through some channel, with the best ones promoted to official status. If the goal of a challenge was to maximise a particular variable (say, a scenario requiring you to achieve the highest GPM over x minutes with hero a, b or c), there could be leaderboards giving the player an idea of where they stand in the particular mechanical skill the scenario is designed to test. Imagine being able to compare your performance in the 10-minute SF test to thousands of others (including pros). As an aside, the leaderboard idea is also kinda cool because it's a reminder that you're actually playing the same game as the pros! I'm not sure I want to know how unfavourably my 10-minute SF test stacks up against the rest of the world, though! I think this sort of information should only be visible to the player themselves, with the top 100 or so being publicly visible (maybe the player could see the score distribution with a little arrow pointing at their percentile, but I'm digressing here).
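The percentile part of that is trivial arithmetic; a real leaderboard would obviously sit behind a database, but the private "where do I stand" figure is just this:

```python
from bisect import bisect_left

def percentile_rank(score, all_scores):
    """Fraction of recorded scores strictly below `score` (higher is better).

    Just the arithmetic behind a hypothetical private percentile display;
    storage and anti-cheat are a separate problem entirely.
    """
    ordered = sorted(all_scores)
    return bisect_left(ordered, score) / len(ordered)
```

So a last-hit score of 50 against recorded scores of [10, 20, 50, 80, 90] puts you at the 40th percentile.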
Obviously there would be some trade-off between design simplicity and the ability to test very specific skills, but I think a middle ground could be reached with judicious design and a lot of community input. On the first count the track record is pretty excellent: judicious design is Valve's bag. Community input is another matter, but I'm sure it will turn out fine, although no doubt some creative chap will find a way to break the system or create something offensive. I have some ideas for intuitive user interfaces for the design component of this game mode (I suppose that's essentially what it is: a "challenge mode" or "custom practice scenario"), which I'd be happy to elaborate on if there was sufficient interest in the concept.
PS: Apologies if this has been mentioned before; it's pretty difficult to guess how a post describing a similar concept would be titled, which makes searching for one challenging. Apologies also for the muddled style of the above; I'm meant to be working (at my real job) and wrote this rather hastily.