[Accepted with Revisions] SDL 0224 - Navigation Subscription Buttons #712
Comments
I am thinking about other map/nav controls that the proposal could be missing. What about:
1. I am a little nervous adding this ability, which could enable OEMs to put these buttons on-screen over the projected navigation app's screen and cause all sorts of compatibility issues (as the author laid out). So I think we have to make sure to add some notes to the Core certification guidelines that would reflect this.
2. Should the Navigation Service have the button press added to its handled RPCs with these new button types?
1. Yes, as I noted, these buttons should either be hard buttons, or be soft buttons that are located somewhere other than the map screen. They should never overlay the map screen.
2. Yes, I believe that would make sense (a rough sketch of how an app would consume these buttons follows below).
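For illustration only, here is a minimal Java sketch of how an app might consume one of these new subscription buttons once they land in the libraries, using the existing SubscribeButton/OnButtonPress pattern from sdl_java_suite. The NAV_ZOOM_IN value is one of the names proposed here and is assumed to exist in ButtonName in your library version; the zoomMapIn() helper is hypothetical.

```java
// Minimal sketch, assuming the sdl_java_suite SubscribeButton/OnButtonPress APIs
// and the NAV_ZOOM_IN ButtonName value proposed in SDL-0224 (final names may differ).
import com.smartdevicelink.managers.SdlManager;
import com.smartdevicelink.protocol.enums.FunctionID;
import com.smartdevicelink.proxy.RPCNotification;
import com.smartdevicelink.proxy.rpc.OnButtonPress;
import com.smartdevicelink.proxy.rpc.SubscribeButton;
import com.smartdevicelink.proxy.rpc.enums.ButtonName;
import com.smartdevicelink.proxy.rpc.listeners.OnRPCNotificationListener;

public class NavButtonSubscriber {
    private final SdlManager sdlManager; // assumed to be started elsewhere

    public NavButtonSubscriber(SdlManager sdlManager) {
        this.sdlManager = sdlManager;
    }

    public void subscribeZoomIn() {
        // Listen for presses on any subscribed button.
        sdlManager.addOnRPCNotificationListener(FunctionID.ON_BUTTON_PRESS, new OnRPCNotificationListener() {
            @Override
            public void onNotified(RPCNotification notification) {
                OnButtonPress press = (OnButtonPress) notification;
                if (press.getButtonName() == ButtonName.NAV_ZOOM_IN) {
                    zoomMapIn(); // app-specific map handling (hypothetical helper)
                }
            }
        });

        // Tell the head unit we want NAV_ZOOM_IN presses (hard button or OEM soft button).
        SubscribeButton subscribe = new SubscribeButton();
        subscribe.setButtonName(ButtonName.NAV_ZOOM_IN);
        sdlManager.sendRPC(subscribe);
    }

    private void zoomMapIn() { /* zoom the projected map view */ }
}
```

The same pattern would apply to the other proposed buttons (pan, tilt, rotate): the app simply subscribes to each button name it supports and branches on getButtonName() in the listener.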
Regarding the below statement,
Mazda and many other OEMs still support hard button inputs, with or without touch input. So in order to work with OEMs that still use hard buttons, the button names defined in this proposal should be used by the application, and I think hard buttons should be considered with the same priority as touch input. I would also like to propose some more actions for this proposal. The user should be able to control all of the functionality of the navigation app using the hard buttons, whereas currently this proposal only suggests actions for the MAP screen. For example, the user should be able to open the menu of the navigation application, scroll through the menu, and select any item in it. Adding some more functions such as the ones below would be sufficient.
In this scenario there is a case in which a single hard button can be used for both MAP operations and MENU operations. For example, a knob can be used for zooming the map in the MAP view and scrolling over the menu items in the MENU view. So the actions should be differentiated according to the screen. This can be achieved by using new enums such as MAP_FULL_SCREEN_VIEW and MAP_LIST_VIEW.
Setting the MAP_FULL_SCREEN_VIEW enum indicates that the current view is the MAP and the hardware buttons can be used for MAP operations. If the MAP_LIST_VIEW enum is set, the current view is the MENU and the hardware buttons can be used for MENU operations. This enum should be set by the application if the application supports hard buttons.
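To make the suggestion concrete, here is a hypothetical Java sketch of how an application could route a single rotary knob differently per view. The MAP_FULL_SCREEN_VIEW and MAP_LIST_VIEW values and the reportViewState()/onKnobRotated() helpers mirror this comment's suggestion only; they are not part of the accepted proposal or any existing SDL API.

```java
// Hypothetical sketch of the view-state idea from this comment; the enum values
// MAP_FULL_SCREEN_VIEW / MAP_LIST_VIEW and these methods are not real SDL APIs.
public class KnobDispatcher {

    enum NavViewState { MAP_FULL_SCREEN_VIEW, MAP_LIST_VIEW }

    private NavViewState currentView = NavViewState.MAP_FULL_SCREEN_VIEW;

    /** Called by the app whenever it switches between the map and its menu/list view. */
    public void reportViewState(NavViewState newState) {
        currentView = newState;
        // In the suggested design, the app would also signal the head unit here
        // so the HMI knows how to route the hard controls.
    }

    /** Called when the head unit reports a knob rotation (positive = clockwise). */
    public void onKnobRotated(int detents) {
        switch (currentView) {
            case MAP_FULL_SCREEN_VIEW:
                zoomMap(detents);    // the knob zooms the map while the map is shown
                break;
            case MAP_LIST_VIEW:
                scrollMenu(detents); // the same knob scrolls the menu in list view
                break;
        }
    }

    private void zoomMap(int detents) { /* adjust map zoom level */ }

    private void scrollMenu(int detents) { /* move menu focus */ }
}
```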
Thank you for the review @tonyaug. I'd like to clear a few things up.
My statement above was intended to say that if a head unit supports touch, it should use touch as the primary input interaction, and only support the hard buttons as a secondary feature.
So, this requires several responses, but I think the criticism here is generally misguided.
I believe that covers all of your concerns. I do not think any changes need to be made to this proposal.
Thank you for the reply @joeljfischer. I am sorry for being naive in my comments. We at Mazda intend to make SDL projection navigation applications work with only hard buttons, so this proposal is very crucial in that context. We were preparing a similar proposal, but most of the items in our proposal are already covered in this one, so we thought it would be good to add our suggestions to this proposal as well. Regarding your comments:
So the app developer will be adding the menu using the
As per this proposal, this would be done by calling the API. Please confirm whether the above understanding is correct. In our scenario, the MENU hard button will be pressed and the action will be notified to the HU. Is it possible for the HU to call the API? We were hoping the menu would be in the application: when the user presses the hard button to open the menu, the application opens the menu, and the user can change focus and select menu items by rotating and clicking the hard button.
https://smartdevicelink.com/en/guides/android/mobile-navigation/supporting-haptic-input/ So as per comments 2 and 3, the app developer needs to provide haptic rect data for each menu item, and the OEM needs to highlight a rect when that rect is focused and inform the application when the rect is pressed using the hard button. Please confirm my understanding. Can you also provide more insight into the below sentence:
Is there any documentation on how an OEM can achieve this when using haptic inputs? In this case, too, our understanding of the functionality was that the application would change the focus of the menu item according to the user's action on the hard button. We were hoping that some guidelines could be defined in order to keep consistency among different app developers.
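For reference, a minimal Java sketch of what providing that rect data looks like with the existing SendHapticData RPC classes in sdl_java_suite. The IDs and coordinates below are made up; in practice the Android library can, I believe, also derive these rectangles from the app's focusable views automatically, so treat this as an illustration rather than the recommended integration path.

```java
// Minimal sketch, assuming the existing SendHapticData / HapticRect / Rectangle
// RPC classes from sdl_java_suite; the IDs and coordinates here are made up.
import com.smartdevicelink.proxy.rpc.HapticRect;
import com.smartdevicelink.proxy.rpc.Rectangle;
import com.smartdevicelink.proxy.rpc.SendHapticData;

import java.util.Arrays;

public class MenuHapticData {

    /** Describe two on-screen menu rows so the OEM HMI can focus and select them. */
    public static SendHapticData buildMenuRects() {
        Rectangle firstRow = new Rectangle();
        firstRow.setX(0f);
        firstRow.setY(0f);
        firstRow.setWidth(800f);
        firstRow.setHeight(100f);

        HapticRect firstItem = new HapticRect();
        firstItem.setId(1);
        firstItem.setRect(firstRow);

        Rectangle secondRow = new Rectangle();
        secondRow.setX(0f);
        secondRow.setY(100f);
        secondRow.setWidth(800f);
        secondRow.setHeight(100f);

        HapticRect secondItem = new HapticRect();
        secondItem.setId(2);
        secondItem.setRect(secondRow);

        // The head unit replaces its stored rect list with whatever is sent here.
        SendHapticData sendHapticData = new SendHapticData();
        sendHapticData.setHapticRectData(Arrays.asList(firstItem, secondItem));
        return sendHapticData;
    }
}
```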
2. Yes, your understanding is correct, and that is the way I would recommend it work, by far, because the performance will be far greater. However, if the app wants to use a video projection menu, that is currently possible using the haptic rect system that already exists.
3. Your understanding appears to be correct. Regarding my comment:
There is no current documentation, because it's completely up to the OEM's HMI. There is no SDL API integration necessary to achieve this. Your HMI will have access to the rectangles that the developer claims can be highlighted and selected. It is then up to you to decide how to highlight those rectangles and which one to highlight when. When the user selects a rect, you send a tap to the app at the center of the rect.
The application does not change its focus, therefore there are no guidelines. It's completely up to the OEM to ensure that focusing works the same within SDL as it does on the rest of the head unit.
4. The problem is that there is no concept of "Back" in SDL. There is no standard SDL "go back to the previous screen." We could create one specifically for video projection apps, but it wouldn't work the same as elsewhere in SDL. Does anyone else have an opinion on a special back button for video projection navigation use cases when such a hard button exists in the car?
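On the HMI side, the "tap at the center of the rect" step is just coordinate math plus whatever touch-injection path the OEM HMI already uses to forward touches to the projected app; it is not an SDL API call. A rough Java sketch, where sendTouchToApp() stands in for that OEM-specific mechanism:

```java
// Hedged sketch of the "send a tap at the center of the selected rect" step described above.
// sendTouchToApp() stands in for whatever OEM-specific mechanism the HMI uses to forward a
// touch toward the projected app; it is not a real SDL call.
public class HapticFocusController {

    static final class Rect {
        final float x, y, width, height;

        Rect(float x, float y, float width, float height) {
            this.x = x;
            this.y = y;
            this.width = width;
            this.height = height;
        }
    }

    /** Called when the user presses the hard "select" control while a rect is focused. */
    public void selectFocusedRect(Rect focused) {
        float centerX = focused.x + focused.width / 2f;
        float centerY = focused.y + focused.height / 2f;
        sendTouchToApp(centerX, centerY); // the app receives this as an ordinary tap
    }

    private void sendTouchToApp(float x, float y) {
        // OEM-specific: inject a touch begin/end at (x, y) toward the projected app.
    }
}
```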
During the 2019-05-07 meeting, the Steering Committee voted to defer this proposal, keeping it in review until our meeting on 2019-05-14. This will allow time for additional discussion on the issue between the author and SDLC members. It was also determined that the final proposal will incorporate this comment, with NAV_ROTATE including rotating left or right 90 degrees. The final proposal will also be updated to have button press added to the Navigation Service's handled RPCs with these new button types.
Hi @joeljfischer, please find my comments.
2. Thank you for confirming my understanding, but I am still not clear about the menu implementation. I hope you have seen the below question from my previous comment:
And your reply,
My understanding from your comment is that the app developer needs to place a haptic rect in the MAP view, and the OEM HMI needs to focus the haptic rect and send a button press to the application when the user presses the MENU hard button. The application will then open the menu screen by calling the API. I think that sending a function like MENU from the HMI to the application, with the application then calling the API, could also work.
3. I understand your comment, but one of the drawbacks of using the haptic rect, as mentioned in the documentation, is that:
I think this case conflicts with our requirements when using the haptic rect. All the menus available in the application should be operable by the user via hard controls. If an application has more menu items than can be displayed in the HMI without scrolling, using haptic rects will not work. Kindly consider this case and provide your comment on it.
4. We would really like to have the BACK button function. If the user opens the MENU view from the MAP view and presses the BACK button, the screen should go back to the MAP view. I also hope that with the currently available APIs it is possible to close the MENU view from the application.
2. I must have missed your comment:
I don't believe your understanding is correct here. The proposal SDL-0116 I referenced above gives the app a way to programmatically open the menu by telling the head unit to do so. The purpose of SDL-0116 is to give a video projection app a way to programmatically open the menu from its side, so that it can have its own "menu" button on its video projection and open the app menu from that. But this would not affect you if you already have your own "menu" button, unless the app creates a video projection menu, which it is not encouraged to do. So, to make a long response short, if you want a hard button to open the app menu, that's already possible: your hard button just has to tell your HMI to open the app menu (a sketch of the app-side piece follows this comment).
3. This is correct. You cannot "scroll" purely using haptic rects. There would have to be an "up arrow" or "down arrow" button on screen that could be highlighted. I can think of a use-case for a subscription button for this. With that said, using the standard SDL templates for activities outside of the map view is, in my mind, strongly preferred for performance and UI cohesion reasons.
4. See my comments on (2) for the ability to close the menu from a hard button. You can already do that by interacting with your HMI, just as other OEMs do when the user touches the "back" or "close" button on the menu screen.
I'm not entirely sure what you mean by this, but no, a proxy app cannot currently programmatically close the menu. This would be a bad idea because it means the app could accidentally close the menu when the user is not expecting it. There's no reason for such a feature to exist because closing the menu is always user-initiated, and the HMI takes care of closing the menu in that case. So, I don't see a reason for a "back" hard button (even in navigation applications, because they ought to be using the standard SDL templates outside of the map view wherever possible). In the only places where such a concept currently exists in SDL (such as the menu, or when an alert appears), there are already APIs in place to handle it. If this is still desired, I would request that you propose it in a separate proposal and provide your reasoning for it, so that it can be a focused discussion and so that it does not delay this proposal further.
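For the app-side half of this (the programmatic open that SDL-0116 introduced), a hedged Java sketch follows. It assumes the RPC in question is the ShowAppMenu request found in recent sdl_java_suite releases; confirm the class name against the library version you target. The hard-button flow described above needs none of this, since that path stays entirely within the OEM HMI.

```java
// Hedged sketch of an app programmatically asking the head unit to show its menu.
// ShowAppMenu is assumed to be the RPC introduced by SDL-0116; verify against your
// sdl_java_suite version before relying on it.
import com.smartdevicelink.managers.SdlManager;
import com.smartdevicelink.proxy.rpc.ShowAppMenu;

public class MenuOpener {
    private final SdlManager sdlManager; // assumed to be started elsewhere

    public MenuOpener(SdlManager sdlManager) {
        this.sdlManager = sdlManager;
    }

    /** Ask the head unit to display this app's SDL menu (top level). */
    public void openAppMenu() {
        sdlManager.sendRPC(new ShowAppMenu());
    }
}
```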
Hi @joeljfischer,
2. I think I understand your comment about the app developer calling the API. Also, regarding the below statement:
Can you provide any information on how to do this?
3. Thank you for the suggestion. We will check this case with that approach.
4. We will check the back button case as per your comment.
I have one more point to note. As I mentioned in this comment,
I would like to confirm the use of it.
2. That is outside of the scope of SDL. It would happen in whatever way you would otherwise signal a message to your HMI when you press a button.
4. I'm not sure what you mean here. The app would differentiate the buttons on the different screens to perform an appropriate action when it receives the button press.
Thank you @joeljfischer for your comments and suggestions. We think we can achieve our requirements as per your suggestion.
The Steering Committee voted to accept this proposal with the following revisions:
Additionally, there was a question in the Steering Committee meeting about requiring applications to be compatible with this proposal. It was advised that this requirement is out of the scope of SDL, and OEMs would need to ask application developers to implement this proposal once it's available in the library for developers to implement.
@joeljfischer please advise when a new PR has been entered to update the proposal to reflect the agreed upon revisions. I'll then merge the PR so the proposal is up to date, and update issues in the respective repositories for implementation. Thanks!
The proposal has been updated to reflect the revisions, and implementation issues have been entered.
Hello SDL community,
The review of "SDL 0224 - Navigation Subscription Buttons" begins now and runs through May 7, 2019. The proposal is available here:
https://github.com/smartdevicelink/sdl_evolution/blob/master/proposals/0224-navigation-subscription-buttons.md
Reviews are an important part of the SDL evolution process. All reviews should be sent to the associated Github issue at:
#712
What goes into a review?
The goal of the review process is to improve the proposal under review through constructive criticism and, eventually, determine the direction of SDL. When writing your review, here are some questions you might want to answer:
Please state explicitly whether you believe that the proposal should be accepted into SDL.
More information about the SDL evolution process is available at
https://github.com/smartdevicelink/sdl_evolution/blob/master/process.md
Thank you,
Theresa Lech
Program Manager - Livio
[email protected]