Today's mice have many buttons, but GNOME seems to act on the assumption that a mouse has only three buttons and a wheel. This assumption is reflected in the mouse configuration applet and in application design generally.
- User wants to use the side buttons of their mouse to navigate back and forward. (possible applications: nautilus, eog, epiphany)
- User wants to use the additional buttons of their mouse to show the desktop, or to perform Exposé-like window management (when it becomes available)
- User wants to swap the behaviour of left and middle click on sliders
It is possible to use external tools like imwheel to make otherwise unused buttons emit keyboard events (like Alt+Left/Right), but this approach has the disadvantage of poor integration; you can't use buttons configured that way to operate applets on the GNOME panel.
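For reference, a workaround of this kind might look like the following `~/.imwheelrc` fragment. This is only a sketch: `Thumb1`/`Thumb2` are imwheel's names for the side buttons, the `".*"` line is a window-matching regex, and depending on the mouse, imwheel may need to be started with a button-mapping option so it sees the thumb buttons at all.

```
# ~/.imwheelrc — sketch: make the side (thumb) buttons emit Alt+Left/Right
# ".*" matches any window class, so the mapping applies everywhere
".*"
None, Thumb1, Alt_L|Left
None, Thumb2, Alt_L|Right
```

As the text above notes, events produced this way are plain keyboard events, so panel applets and other widgets that expect real button presses won't react to them.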
Create an interface similar to gnome-keybinding-properties (or just extend the latter) which allows binding some core events to mouse buttons or keyboard shortcuts. Applications would then listen for these named events rather than for raw key-press events.
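A minimal sketch of that idea, with entirely hypothetical names (a real implementation would live in GNOME's settings layer and the toolkit): a central table maps triggers (mouse buttons or shortcuts) to named core events, and applications subscribe to the events instead of inspecting raw input.

```python
# Sketch: a central binding table that maps raw triggers (mouse buttons
# or key shortcuts) to named core events which applications listen for.
# All names here are hypothetical, for illustration only.

class EventBindings:
    def __init__(self):
        self._bindings = {}   # trigger string -> core event name
        self._handlers = {}   # core event name -> list of callbacks

    def bind(self, trigger, event_name):
        """Map a trigger like 'mouse:8' or 'key:Alt+Left' to a core event."""
        self._bindings[trigger] = event_name

    def listen(self, event_name, callback):
        """An application registers interest in a core event."""
        self._handlers.setdefault(event_name, []).append(callback)

    def dispatch(self, trigger):
        """Called by the input layer when a raw button/key event arrives."""
        event = self._bindings.get(trigger)
        for callback in self._handlers.get(event, []):
            callback()

bindings = EventBindings()
# Both the side button and the keyboard shortcut map to the same event,
# so applications only need to handle "navigate-back" once.
bindings.bind("mouse:8", "navigate-back")
bindings.bind("key:Alt+Left", "navigate-back")

history = []
bindings.listen("navigate-back", lambda: history.append("back"))

bindings.dispatch("mouse:8")
bindings.dispatch("key:Alt+Left")
print(history)  # ['back', 'back']
```

The point of the indirection is that rebinding a button in the configuration applet changes only the table; applications never see the raw button number.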
Would mouse gestures be part of this task? Or would that be part of Metacity? - GustavoHexsel
Another use case
- I use my side buttons to switch workspaces, and it's very useful; it's like having a very wide screen. I use it with compiz, but I think it would be better if GNOME managed this directly, so the side buttons could be used without compiz, for example.
- It would be good to be able to configure all mouse buttons and assign each one either an action from a predefined list, or an arbitrary command.
- If this project needs some thought about the interface, I can make mockups or sketches.
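Until GNOME handles this natively, the workspace-switching use case above can be approximated with external tools. A hedged sketch using xbindkeys together with xdotool (both are assumptions here, not part of this proposal; button numbers 8 and 9 are the usual side buttons, but they vary by mouse, so check with xev first):

```
# ~/.xbindkeysrc — sketch: side buttons move to the next/previous workspace
# (assumes xbindkeys and xdotool are installed)
"xdotool set_desktop --relative 1"
  b:9
"xdotool set_desktop --relative -- -1"
  b:8
```

This has the same integration drawback described earlier: the desktop switches, but GNOME itself knows nothing about the binding, and the configuration applet cannot show or edit it.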