Using the latest Dockerfile with the changes from the cmake refactor, I've been building the image and packaging it with BinaryNinja successfully. However, as of #666, support for using BinaryNinja with mcsema-disass appears to have been removed. Was there a reason for this, and are there any plans to add support for it again?
The documentation still refers to BinaryNinja in a number of places, as does the code itself.
Hello! Yes, we dropped Binary Ninja support because it wasn't being actively maintained, and because it had gotten out of date with the cross-reference lifting approach taken in the cmake_refactor branch (since merged to master). Our officially supported disassembler is IDA Pro >= 7.0. I will update the documentation to reflect this.
I think it needs to be thoroughly debugged :-) I think we want to keep it around as it's a good option for open-source users. Ideally, I think we'd eventually want to move to using LLVM's own object file loading APIs and then implement some kind of basic disassembly functionality on top of that (porting the Dyninst code to use the LLVM APIs or something like that).
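For reference, a minimal sketch of what the first step of that LLVM-based direction could look like, assuming a reasonably recent LLVM: `llvm::object::ObjectFile::createObjectFile` is LLVM's entry point for parsing ELF/PE/Mach-O containers, and the section walk here just stands in for whatever a disassembler frontend would need to recover. This is not mcsema code, only an illustration of the idea.

```cpp
// Sketch: open a binary with LLVM's object-file APIs and walk its sections.
// Illustrative only, not part of mcsema; assumes a recent LLVM install.
#include "llvm/Object/ObjectFile.h"
#include "llvm/Support/Error.h"
#include "llvm/Support/Format.h"
#include "llvm/Support/raw_ostream.h"

using namespace llvm;
using namespace llvm::object;

int main(int argc, char **argv) {
  if (argc < 2) {
    errs() << "usage: " << argv[0] << " <binary>\n";
    return 1;
  }

  // createObjectFile detects the container format (ELF, PE/COFF, Mach-O)
  // and returns a format-agnostic ObjectFile handle.
  Expected<OwningBinary<ObjectFile>> BinOrErr =
      ObjectFile::createObjectFile(argv[1]);
  if (!BinOrErr) {
    errs() << "error: " << toString(BinOrErr.takeError()) << "\n";
    return 1;
  }

  ObjectFile &Obj = *BinOrErr->getBinary();
  for (const SectionRef &Sec : Obj.sections()) {
    Expected<StringRef> Name = Sec.getName();
    if (!Name) {
      consumeError(Name.takeError());
      continue;
    }
    // A real frontend would feed executable section contents into a
    // disassembler here; this sketch only prints the section layout.
    outs() << *Name << " @ " << format_hex(Sec.getAddress(), 10)
           << " size " << Sec.getSize() << "\n";
  }
  return 0;
}
```

Something like `clang++ sketch.cpp $(llvm-config --cxxflags --ldflags --libs object support)` should build it against an installed LLVM.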